Fit posterior probabilities for support vector machine (SVM) classifier
ScoreSVMModel = fitPosterior(SVMModel) returns a trained support vector machine (SVM) classifier ScoreSVMModel containing the optimal score-to-posterior-probability transformation function for two-class learning. For more details, see Algorithms.

[ScoreSVMModel,ScoreTransform] = fitPosterior(SVMModel) additionally returns the optimal score-to-posterior-probability transformation function parameters.

[ScoreSVMModel,ScoreTransform] = fitPosterior(SVMModel,Name,Value) uses additional options specified by one or more name-value pair arguments. For example, you can specify the number of folds or the holdout sample proportion.
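As a sketch of the two- and three-output-argument syntaxes (assuming the ionosphere sample data set that ships with Statistics and Machine Learning Toolbox, and assuming the 'Holdout' name-value pair for holdout validation):

```matlab
% Sketch, not a definitive example: fit the transformation and also
% retrieve its parameters, using a 20% holdout sample instead of the
% default 10-fold cross-validation.
load ionosphere                  % sample data: X (predictors), Y (two-class labels)
SVMModel = fitcsvm(X,Y);         % train the SVM classifier

[ScoreSVMModel,ScoreParams] = fitPosterior(SVMModel,'Holdout',0.2);
ScoreParams                      % parameters of the fitted transformation function
```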
This process describes one way to predict positive class posterior probabilities.

1. Train an SVM classifier by passing the data to fitcsvm. The result is a trained SVM classifier, such as SVMModel, that stores the data. The software sets the score transformation function property (SVMModel.ScoreTransform) to none.

2. Pass the trained SVM classifier SVMModel to fitSVMPosterior or fitPosterior. The result, such as ScoreSVMModel, is the same trained SVM classifier as SVMModel, except the software sets ScoreSVMModel.ScoreTransform to the optimal score transformation function.

3. Pass the predictor data matrix and the trained SVM classifier containing the optimal score transformation function (ScoreSVMModel) to predict. The second column in the second output argument of predict stores the positive class posterior probabilities corresponding to each row of the predictor data matrix.

If you skip step 2, then predict returns the positive class score rather than the positive class posterior probability.
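The three steps above can be sketched as follows (a minimal example, assuming the ionosphere sample data set, which has two classes, 'b' and 'g'):

```matlab
% Minimal sketch of the three-step workflow.
load ionosphere                     % predictor matrix X, class labels Y

% Step 1: train an SVM classifier; ScoreTransform is 'none' by default.
SVMModel = fitcsvm(X,Y);

% Step 2: fit the optimal score-to-posterior transformation function.
ScoreSVMModel = fitPosterior(SVMModel);

% Step 3: predict. The second column of the second output holds the
% posterior probability of the positive class (SVMModel.ClassNames(2)).
[labels,postProbs] = predict(ScoreSVMModel,X);
postProbs(1:3,2)                    % posteriors for the first three observations
```

Calling predict with SVMModel instead of ScoreSVMModel in step 3 would return raw scores in the second output rather than posterior probabilities.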
After fitting posterior probabilities, you can generate C/C++ code that predicts labels for new data. Generating C/C++ code requires MATLAB® Coder™. For details, see Introduction to Code Generation.
The software fits the appropriate score-to-posterior-probability transformation function by using the SVM classifier SVMModel and by conducting 10-fold cross-validation using the stored predictor data (SVMModel.X) and the class labels (SVMModel.Y), as outlined in [1]. The transformation function computes the posterior probability that an observation is classified into the positive class (SVMModel.ClassNames(2)).
If the classes are inseparable, then the transformation function is the sigmoid function.
If the classes are perfectly separable, then the transformation function is the step function.
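In the inseparable case, the fitted transformation is a two-parameter sigmoid in the score. A minimal sketch of its shape, with illustrative slope and intercept values (not values from a real fit):

```matlab
% Platt's sigmoid maps a score s to a posterior probability:
%   P(positive | s) = 1 / (1 + exp(A*s + B))
% A and B below are illustrative placeholders; in practice the software
% estimates them by cross-validation on the stored data.
A = -2;  B = 0.5;
plattSigmoid = @(s) 1./(1 + exp(A.*s + B));
plattSigmoid([-1 0 1])   % posteriors increase with the score when A < 0
```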
In two-class learning, if one of the two classes has a relative frequency of 0, then the transformation function is the constant function. The fitPosterior function is not appropriate for one-class learning.
The software stores the optimal score-to-posterior-probability transformation function in ScoreSVMModel.ScoreTransform.
If you re-estimate the score-to-posterior-probability transformation function, that is, if you pass an SVM classifier to fitPosterior or fitSVMPosterior and its ScoreTransform property is not none, then the software:

- Displays a warning
- Resets the original transformation function to 'none' before estimating the new one
You can also fit the posterior probability function by using fitSVMPosterior. This function is similar to fitPosterior, except that it is broader: it accepts a wider range of SVM classifier types.
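For example, fitSVMPosterior also accepts cross-validated SVM models (ClassificationPartitionedModel objects), which the fitPosterior method of a trained ClassificationSVM object does not. A minimal sketch, again assuming the ionosphere sample data:

```matlab
load ionosphere
CVSVMModel = crossval(fitcsvm(X,Y));            % 10-fold cross-validated SVM model
ScoreCVSVMModel = fitSVMPosterior(CVSVMModel);  % fits the transformation on the
                                                % partitioned model directly
```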
[1] Platt, J. “Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods.” Advances in Large Margin Classifiers. Cambridge, MA: The MIT Press, 2000, pp. 61–74.