Cross-validation loss of partitioned regression model
L = kfoldLoss(cvmodel)
L = kfoldLoss(cvmodel,Name,Value)
L = kfoldLoss(cvmodel) returns the cross-validation loss of cvmodel.

L = kfoldLoss(cvmodel,Name,Value) returns the cross-validation loss with additional options specified by one or more Name,Value pair arguments. You can specify several name-value pair arguments in any order as Name1,Value1,…,NameN,ValueN.
cvmodel
Object of class RegressionPartitionedModel, created by fitrtree with one of the cross-validation options ('crossval', 'kfold', 'holdout', 'leaveout', or 'cvpartition'), or by applying crossval to a regression tree.
Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside quotes. You can specify several name-value pair arguments in any order as Name1,Value1,...,NameN,ValueN.
'folds'
Indices of folds ranging from 1 to cvmodel.KFold. Use only these folds for predictions.
Default: 1:cvmodel.KFold
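For example, to evaluate the loss on a subset of folds only (fold indices 1 and 3 here are illustrative), pass them through 'folds':

```matlab
% Cross-validation loss computed from folds 1 and 3 only
Lsub = kfoldLoss(cvmodel,'folds',[1 3]);
```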
'lossfun'
Function handle for a loss function, or 'mse', meaning mean squared error. A loss function has the syntax fun(Y,Yfit,W), where:
Y is the vector of true responses.
Yfit is the vector of predicted responses.
W is the vector of observation weights.
The returned value fun(Y,Yfit,W) should be a scalar.
Default: 'mse'
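As a sketch of a custom loss function, the anonymous function below implements a weighted mean absolute error (the name maeFun is hypothetical) and passes it through 'lossfun':

```matlab
% Hypothetical custom loss: weighted mean absolute error
maeFun = @(Y,Yfit,W) sum(W.*abs(Y-Yfit))/sum(W);
Lmae = kfoldLoss(cvmodel,'lossfun',maeFun);
```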
'mode'
One of the following:
'average': L is the average loss over all folds.
'individual': L is a vector of individual losses, one per fold.
Default: 'average'
L
The loss (mean squared error) between the observations in a fold and the predictions made with a tree trained on the out-of-fold data. If 'mode' is 'average', L is the average loss over all folds. If 'mode' is 'individual', L is a vector of losses, one per fold.
Construct a partitioned regression model, and examine the cross-validation losses for the folds:
load carsmall
XX = [Cylinders Displacement Horsepower Weight];
YY = MPG;
cvmodel = fitrtree(XX,YY,'crossval','on');
L = kfoldLoss(cvmodel,'mode','individual')

L =

   44.9635
   11.8525
   18.2046
    9.2965
   29.4329
   54.8659
   24.6446
    8.2085
   19.7593
   16.7394
You can avoid constructing a cross-validated tree model by calling cvloss instead. The cross-validated tree can save time if you expect to examine it more than once.
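As a sketch of that alternative (reusing the XX and YY variables from the example above), cvloss computes the cross-validation error directly from a single fitted tree, without keeping a partitioned model around:

```matlab
% Fit one (non-partitioned) tree, then cross-validate its loss directly
tree = fitrtree(XX,YY);
E = cvloss(tree);   % cross-validated mean squared error
```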
fitrtree | kfoldPredict | loss