perClass Documentation
version 5.1 (31-May-2017)
 SDSVC  Support vector machine (trained by libSVM)

      P=SDSVC(DATA,options)
      [P,E]=SDSVC(DATA,options)

 INPUT
   DATA      SDDATA object

 OUTPUT
   P         Pipeline object
   E         Structure with grid search results

 OPTIONS
  'type'     kernel type: 'RBF','poly','linear' (default: RBF)
  'sigma'    RBF sigma (default: select by grid-search)
  'degree'   Polynomial degree (default: select by grid-search)
  'C'        cost parameter C (default: select by grid-search)
  'noscale'  Do not include data scaling
  'test'     Provide external sddata for evaluating error in parameter search
  'tsfrac'   If 'test' is not specified, fraction of DATA selected
             randomly per class for evaluating error criterion (def: 0.25)
  'one-against-one'  Use one-against-one multi-class strategy instead of
                     default one-against-all
  'verbose'  Show verbose output of libsvm
  'no shrink' Disable libsvm shrinking heuristic (may speed up optimization)
  'prob'      Return probabilistic soft outputs (for two-class case only)
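
 For illustration, a hypothetical call fixing the kernel parameters so
 that no grid search is performed (b is assumed to be an sddata object;
 the values 2 and 10 are arbitrary):

   % fix RBF sigma and cost C instead of grid-searching them
   p=sdsvc(b,'sigma',2,'C',10)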

 DESCRIPTION
 SDSVC trains a support vector machine using libSVM.  By default, an RBF
 kernel is used, with the sigma and C parameters optimized by a grid
 search minimizing the mean error. Polynomial and linear SVMs are
 available via the 'type' option. For multi-class problems, the
 one-against-all strategy is adopted by default; the one-against-one
 strategy may be selected with the identically-named option.
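
 A sketch of selecting the alternative multi-class strategy (assuming b
 holds more than two classes):

   % one-against-one instead of the default one-against-all
   p=sdsvc(b,'one-against-one')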
 By default, for the RBF and polynomial kernels, sdsvc scales the data
 (standardization). Scaling may be switched off with the 'noscale' option.
 sdsvc splits DATA into a subset used for training the model and a subset
 used for error estimation/parameter selection (by default 25% of DATA).
 This fraction may be adjusted with the 'tsfrac' option. Alternatively,
 the user may provide an external set for error estimation using the
 'test' option.
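
 Both ways of controlling the error-estimation set can be sketched as
 follows (b and ts are assumed to be sddata objects):

   % hold out 40% of DATA per class for the error criterion
   p=sdsvc(b,'tsfrac',0.4)
   % or evaluate the parameter search on an external set ts
   p=sdsvc(b,'test',ts)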
 The trained SDSVC pipeline provides access to the kernel parameter
 (sigma/degree), the C constant used in training, the number of support
 vectors (svcount), and the indices of support vectors in the training
 set (svind).
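
 Inspecting these parameters might look as follows, assuming the same
 field-access syntax as the svind example below and that scaling is the
 first pipeline step (hence p(2)):

   p(2).sigma    % kernel parameter found by grid search
   p(2).C        % cost constant used in training
   p(2).svcount  % number of support vectors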

 Probabilistic soft outputs are available for two-class classifiers via
 the 'prob' option (Platt 2000, improved by Lin et al.; see
 https://www.csie.ntu.edu.tw/~cjlin/papers/svmprob/svmprob.pdf)
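
 A minimal sketch, assuming b is a two-class sddata object:

   % soft outputs are returned as class probabilities
   p=sdsvc(b,'prob')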

 EXAMPLES
   p=sdsvc(b)
   p'
   origSV=b( p(2).svind ) % p(2) because the first step is scaling

 READ MORE
http://perclass.com/doc/guide/classifiers.html#sdsvc

 REFERENCE
 Chih-Chung Chang and Chih-Jen Lin, LIBSVM : a library for support vector
 machines, 2001. http://www.csie.ntu.edu.tw/~cjlin/libsvm

 J. Platt, Probabilistic outputs for support vector machines and comparison
 to regularized likelihood methods, Advances in Large Margin Classifiers,
 Cambridge, MA, 2000. http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.41.1639

sdsvc is referenced in examples: