function model = adaBoostTrain( X0, X1, varargin )


 Train boosted decision tree classifier.

 Heavily optimized code for training Discrete or Real AdaBoost where the
 weak classifiers are decision trees. With multi-core support enabled (see
 binaryTreeTrain.m), boosting 256 depth-2 trees over 5,000 features and
 5,000 data points takes under 5 seconds; see the example below. Most of
 the training time is spent in binaryTreeTrain.m.

 For more information on how to quickly boost decision trees see:
   [1] R. Appel, T. Fuchs, P. Dollár, P. Perona; "Quickly Boosting
   Decision Trees – Pruning Underachieving Features Early," ICML 2013.
 The code here implements a simple brute-force strategy with the option to
 sample features used for training each node for additional speedups.
 Further gains using the ideas from the ICML paper are possible. If you
 use this code please consider citing our ICML paper.
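 For readers unfamiliar with the algorithm itself, the loop below is a
 minimal, illustrative Python sketch of Discrete AdaBoost with depth-1
 stumps as weak learners. It is not this file's implementation (which is
 heavily optimized MATLAB/C); all names are hypothetical, and a full tree
 learner would replace stump_train with binaryTreeTrain-style training.

```python
import math

def stump_train(X, y, w):
    # Brute-force search over every (feature, threshold, polarity) for the
    # depth-1 stump with minimum weighted error -- the same brute-force
    # strategy over features that the help text describes.
    best = (float('inf'), 0, 0.0, 1)  # (weighted err, feature id, thr, polarity)
    n_feat = len(X[0])
    for j in range(n_feat):
        for t in sorted({x[j] for x in X}):
            for p in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if (1 if p * (xi[j] - t) > 0 else -1) != yi)
                if err < best[0]:
                    best = (err, j, t, p)
    return best

def adaboost_train(X, y, n_weak=8):
    # Discrete AdaBoost: weak learner k gets vote alpha_k = .5*log((1-err)/err)
    # and misclassified samples are up-weighted before the next round.
    n = len(y)
    w = [1.0 / n] * n
    model = []
    for _ in range(n_weak):
        err, j, t, p = stump_train(X, y, w)
        err = min(max(err, 1e-10), 1 - 1e-10)      # guard log() at err 0 or 1
        alpha = 0.5 * math.log((1 - err) / err)
        preds = [1 if p * (xi[j] - t) > 0 else -1 for xi in X]
        w = [wi * math.exp(-alpha * yi * hi) for wi, yi, hi in zip(w, y, preds)]
        s = sum(w)
        w = [wi / s for wi in w]                   # renormalize weights
        model.append((alpha, j, t, p))
    return model

def adaboost_apply(x, model):
    # The strong classifier is the sign of the weighted sum of stump votes.
    return sum(a * (1 if p * (x[j] - t) > 0 else -1) for a, j, t, p in model)
```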

  model = adaBoostTrain( X0, X1, [pBoost] )

  X0         - [N0xF] negative feature vectors
  X1         - [N1xF] positive feature vectors
  pBoost     - additional params (struct or name/value pairs)
   .pTree      - ['REQ'] parameters for binaryTreeTrain
   .nWeak      - [128] number of trees to learn
   .discrete   - [1] if true train Discrete AdaBoost, else Real AdaBoost
   .verbose    - [0] if true print status information

  model      - learned boosted tree classifier with the following fields
   .fids       - [K x nWeak] feature ids for each node
   .thrs       - [K x nWeak] threshold corresponding to each fid
   .child      - [K x nWeak] index of child for each node (1-indexed)
   .hs         - [K x nWeak] log ratio (.5*log(p/(1-p))) at each node
   .weights    - [K x nWeak] total sample weight at each node
   .depth      - [K x nWeak] depth of each node
   .errs       - [1 x nWeak] error for each tree (for debugging)
   .losses     - [1 x nWeak] loss after every iteration (for debugging)
   .treeDepth  - depth of all leaf nodes (or 0 if leaf depth varies)
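 The flattened per-node arrays above (fids, thrs, child, hs) can be
 traversed as in the following Python sketch. It assumes the toolbox
 convention that child(k)==0 marks a leaf and that an internal node's two
 children sit at the 1-indexed positions child(k) and child(k)+1; consult
 adaBoostApply.m for the authoritative traversal.

```python
def tree_apply(x, fids, thrs, child, hs):
    # Walk one flattened tree stored as parallel per-node arrays.
    # child[k] == 0 marks a leaf; otherwise child[k] is the 1-indexed
    # position of the "below threshold" child and child[k]+1 its sibling.
    k = 0                        # root is node 1, i.e. index 0 here
    while child[k]:
        if x[fids[k]] < thrs[k]:
            k = child[k] - 1     # left branch (convert 1-indexed to 0-indexed)
        else:
            k = child[k]         # right branch
    return hs[k]                 # leaf output: the log ratio .5*log(p/(1-p))

def boosted_apply(x, trees):
    # The boosted classifier sums the leaf outputs of every weak tree;
    # a positive total score means the positive class.
    return sum(tree_apply(x, *t) for t in trees)
```

 For example, a single depth-1 tree splitting feature 0 at threshold 5
 would be stored as fids=[0,0,0], thrs=[5,0,0], child=[2,0,0],
 hs=[0,-1,+1].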

  % output should be: 'Testing err=0.0145 fp=0.0165 fn=0.0125'
  N=5000; F=5000; sep=.01; RandStream.getGlobalStream.reset();
  [xTrn,hTrn,xTst,hTst]=demoGenData(N,N,2,F/10,sep,.5,0);
  xTrn=repmat(single(xTrn),[1 10]); xTst=repmat(single(xTst),[1 10]);
  pBoost=struct('nWeak',256,'verbose',16,'pTree',struct('maxDepth',2));
  model = adaBoostTrain( xTrn(hTrn==1,:), xTrn(hTrn==2,:), pBoost );
  fp = mean(adaBoostApply( xTst(hTst==1,:), model )>0);
  fn = mean(adaBoostApply( xTst(hTst==2,:), model )<0);
  fprintf('Testing err=%.4f fp=%.4f fn=%.4f\n',(fp+fn)/2,fp,fn);

 See also adaBoostApply, binaryTreeTrain, demoGenData

 Piotr's Computer Vision Matlab Toolbox      Version 3.21
 Copyright 2014 Piotr Dollar.
 Licensed under the Simplified BSD License [see external/bsd.txt]
