function [ferns,hsPr] = fernsClfTrain( data, hs, varargin )


 Train random fern classifier.

 See "Fast Keypoint Recognition in Ten Lines of Code" by Mustafa Ozuysal,
 Pascal Fua and Vincent Lepetit, CVPR07.

  M - number of ferns
  S - fern depth
  F - number of features
  N - number of input vectors
  H - number of classes

  [ferns,hsPr] = fernsClfTrain( data, hs, [varargin] )

  data     - [NxF] N feature vectors of length F
  hs       - [Nx1] target output labels in [1,H]
  varargin - additional params (struct or name/value pairs)
   .S        - [10] fern depth (fern table size is exponential in S)
   .M        - [50] number of ferns to train
   .thrr     - [0 1] range for randomly generated thresholds
   .bayes    - [1] if true, combine probs using the naive Bayes assumption
   .ferns    - [] if given, reuse previous ferns (recompute pFern only)
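Each fern applies S binary tests (feature fids compared against thresholds thrs); packing the S resulting bits into an integer yields a leaf index in [1,2^S], which is why fern tables grow exponentially in S. A Python sketch of this index computation (illustrative only; the toolbox computes indices in compiled code via fernsInds, and uses MATLAB 1-based indexing):

```python
import numpy as np

def fern_leaf_index(x, fids, thrs):
    """Map a feature vector x to a fern leaf index in [0, 2^S).

    Each of the S binary tests compares one feature against its
    threshold; the S boolean outcomes are packed into an integer,
    with the first test contributing the most significant bit.
    """
    bits = x[fids] < thrs  # S boolean test outcomes
    weights = 2 ** np.arange(len(fids) - 1, -1, -1)
    return int(np.dot(bits, weights))

# example: one fern with S=3 tests on a toy feature vector
x = np.array([0.2, 0.9, 0.5, 0.1])
fids = np.array([0, 1, 3])        # feature ids used by this fern
thrs = np.array([0.5, 0.5, 0.5])  # matching thresholds
leaf = fern_leaf_index(x, fids, thrs)  # bits [1,0,1] -> leaf 5
```

With S tests the fern partitions feature space into 2^S cells, and training simply counts class frequencies per cell.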

  ferns    - learned fern model with the following fields
   .fids     - [MxS] feature ids for each fern for each depth
   .thrs     - [MxS] threshold corresponding to each fid
   .pFern    - [2^SxHxM] learned log probs at fern leaves
   .bayes    - if true, probs were combined using the naive Bayes assumption
   .inds     - [NxM] cached indices for original training data
   .H        - number of classes
  hsPr     - [Nx1] predicted output labels
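Under the naive Bayes assumption, the per-fern class probabilities stored in pFern combine by summing log probabilities over the M ferns and taking the argmax over classes. A hypothetical Python sketch of this combination (not the toolbox's actual fernsClfApply code; array names mirror the fields above, with 0-based leaf indices):

```python
import numpy as np

def ferns_apply(leaf_inds, pFern):
    """Predict labels by summing per-fern log-probabilities.

    leaf_inds : (N, M) leaf index of each sample in each fern
    pFern     : (2**S, H, M) log class probabilities at fern leaves
    Returns (N,) predicted labels in [1, H] (MATLAB-style 1-based).
    """
    N, M = leaf_inds.shape
    H = pFern.shape[1]
    logp = np.zeros((N, H))
    for m in range(M):  # naive Bayes: sum log probs across ferns
        logp += pFern[leaf_inds[:, m], :, m]
    return np.argmax(logp, axis=1) + 1  # back to 1-based labels

# toy example: M=2 ferns, S=1 (2 leaves each), H=2 classes
pFern = np.log(np.array([[[0.9, 0.8], [0.1, 0.2]],
                         [[0.2, 0.3], [0.8, 0.7]]]))  # (2, 2, 2)
leaf_inds = np.array([[0, 0],   # sample 0 lands in leaf 0 twice
                      [1, 1]])  # sample 1 lands in leaf 1 twice
labels = ferns_apply(leaf_inds, pFern)  # -> [1, 2]
```

Summing in log space avoids underflow when M is large; with bayes=0 the toolbox instead averages the raw probabilities.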

  N=5000; H=5; d=2; [xs0,hs0,xs1,hs1]=demoGenData(N,N,H,d,1,1);
  fernPrm=struct('S',4,'M',50,'thrr',[-1 1],'bayes',1);
  tic, [ferns,hsPr0]=fernsClfTrain(xs0,hs0,fernPrm); toc
  tic, hsPr1 = fernsClfApply( xs1, ferns ); toc
  e0=mean(hsPr0~=hs0); e1=mean(hsPr1~=hs1);
  fprintf('errors trn=%f tst=%f\n',e0,e1); figure(1);
  subplot(2,2,1); visualizeData(xs0,2,hs0);
  subplot(2,2,2); visualizeData(xs0,2,hsPr0);
  subplot(2,2,3); visualizeData(xs1,2,hs1);
  subplot(2,2,4); visualizeData(xs1,2,hsPr1);

 See also fernsClfApply, fernsInds

 Piotr's Computer Vision Matlab Toolbox      Version 2.61
 Copyright 2014 Piotr Dollar.  [pdollar-at-gmail.com]
 Licensed under the Simplified BSD License [see external/bsd.txt]
