adaBoostTrain

PURPOSE

Train boosted decision tree classifier.

SYNOPSIS

function model = adaBoostTrain( X0, X1, varargin )

DESCRIPTION

 Train boosted decision tree classifier.

 Heavily optimized code for training Discrete or Real AdaBoost where the
 weak classifiers are decision trees. With multi-core support enabled (see
 binaryTreeTrain.m), boosting 256 depth-2 trees over 5,000 features and
 5,000 data points takes under 5 seconds (see the example below). Most of
 the training time is spent in binaryTreeTrain.m.
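
 To make the training procedure concrete, the following is a simplified
 sketch of the Discrete-AdaBoost loop run here, given feature matrices X0
 and X1. It assumes binaryTreeTrain's data-struct interface (fields X0/X1
 plus per-sample weights wts0/wts1) and binaryTreeApply(X,tree); treat
 these as assumptions to verify against binaryTreeTrain.m rather than as
 the exact implementation:

  nWeak = 128; pTree = struct('maxDepth',2); % illustrative parameters
  data = struct('X0',X0,'X1',X1);            % tree-training data
  [N0,N1] = deal(size(X0,1),size(X1,1));
  H0 = zeros(N0,1); H1 = zeros(N1,1);        % cumulative boosted scores
  for t = 1:nWeak
    [tree,data,err] = binaryTreeTrain( data, pTree ); % fit weak learner
    tree.hs = (tree.hs>0)*2-1;               % discrete: +/-1 leaf outputs
    alpha = .5*log((1-err)/err);             % weight for this tree
    H0 = H0 + alpha*binaryTreeApply( X0, tree );
    H1 = H1 + alpha*binaryTreeApply( X1, tree );
    data.wts0 = exp( H0)/N0/2;               % upweight misclassified negs
    data.wts1 = exp(-H1)/N1/2;               % upweight misclassified poss
  end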

 For more information on how to quickly boost decision trees see:
   [1] R. Appel, T. Fuchs, P. Dollár, P. Perona; "Quickly Boosting
   Decision Trees – Pruning Underachieving Features Early," ICML 2013.
 The code here implements a simple brute-force strategy, with the option
 to sample the features used to train each node for additional speedups.
 Further gains using the ideas from the ICML paper are possible. If you
 use this code, please consider citing our ICML paper.
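
 For example, if binaryTreeTrain exposes a fraction-of-features sampling
 parameter (fracFtrs below; confirm the name against binaryTreeTrain.m,
 as it is an assumption here), each node split would consider only a
 random subset of the features:

  pTree = struct('maxDepth',2,'fracFtrs',1/16); % sample 1/16 of features
  pBoost = struct('nWeak',256,'pTree',pTree);
  model = adaBoostTrain( X0, X1, pBoost );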

 USAGE
  model = adaBoostTrain( X0, X1, [pBoost] )

 INPUTS
  X0         - [N0xF] negative feature vectors
  X1         - [N1xF] positive feature vectors
  pBoost     - additional params (struct or name/value pairs, see below)
   .pTree      - ['REQ'] parameters for binaryTreeTrain
   .nWeak      - [128] number of trees to learn
   .discrete   - [1] if true train Discrete-AdaBoost, else Real-AdaBoost
   .verbose    - [0] if true print status information
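
 Both calling forms below are equivalent; a minimal sketch assuming
 feature matrices X0 and X1 are already in memory:

  pBoost = struct('nWeak',256,'pTree',struct('maxDepth',2));
  model = adaBoostTrain( X0, X1, pBoost );              % params as struct
  model = adaBoostTrain( X0, X1, ...                    % name/value pairs
    'nWeak',256, 'pTree',struct('maxDepth',2) );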

 OUTPUTS
  model      - learned boosted tree classifier with the following fields;
               see the usage sketch after this list
   .fids       - [K x nWeak] feature ids for each node
   .thrs       - [K x nWeak] threshold corresponding to each fid
   .child      - [K x nWeak] index of child for each node (1-indexed)
   .hs         - [K x nWeak] log ratio (.5*log(p/(1-p))) at each node
   .weights    - [K x nWeak] total sample weight at each node
   .depth      - [K x nWeak] depth of each node
   .errs       - [1 x nWeak] error for each tree (for debugging)
   .losses     - [1 x nWeak] loss after every iteration (for debugging)
   .treeDepth  - depth of all leaf nodes (or 0 if leaf depth varies)
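
 The per-tree columns of these fields are what adaBoostApply consumes at
 test time. Below is a minimal pure-MATLAB sketch of the equivalent
 computation; it assumes binaryTreeApply(X,tree) and that column t of
 each field encodes tree t (assumptions to check against adaBoostApply.m):

  hs = zeros(size(X,1),1);                  % boosted score per sample
  for t = 1:size(model.fids,2)
    tree = struct('fids',model.fids(:,t),'thrs',model.thrs(:,t), ...
      'child',model.child(:,t),'hs',model.hs(:,t));
    hs = hs + binaryTreeApply( X, tree );   % add tree t's scaled output
  end
  yhat = sign(hs);                          % predicted labels in {-1,+1}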

 EXAMPLE
  % output should be: 'Testing err=0.0145 fp=0.0165 fn=0.0125'
  N=5000; F=5000; sep=.01; RandStream.getGlobalStream.reset();
  [xs0,xs1,xs0T,xs1T]=demoGenData(N,N,2,F/10,sep,.5,0);
  xs0=repmat(single(xs0),[1 10]); xs1=repmat(single(xs1),[1 10]);
  xs0T=repmat(single(xs0T),[1 10]); xs1T=repmat(single(xs1T),[1 10]);
  pBoost=struct('nWeak',256,'verbose',16,'pTree',struct('maxDepth',2));
  tic, model = adaBoostTrain( xs0, xs1, pBoost ); toc
  fp = mean(adaBoostApply( xs0T, model )>0);
  fn = mean(adaBoostApply( xs1T, model )<0);
  fprintf('Testing err=%.4f fp=%.4f fn=%.4f\n',(fp+fn)/2,fp,fn);

 See also adaBoostApply, binaryTreeTrain, demoGenData

 Piotr's Computer Vision Matlab Toolbox      Version 3.21
 Copyright 2014 Piotr Dollar.  [pdollar-at-gmail.com]
 Licensed under the Simplified BSD License [see external/bsd.txt]
