resume

Resume training learners on cross-validation folds

Syntax

ens1 = resume(ens,nlearn)
ens1 = resume(ens,nlearn,Name,Value)

Description

ens1 = resume(ens,nlearn) trains ens in every fold for nlearn more cycles. resume uses the same training options fitcensemble used to create ens.

ens1 = resume(ens,nlearn,Name,Value) trains ens with additional options specified by one or more Name,Value pair arguments.

Input Arguments

ens

A cross-validated classification ensemble. ens is the result of either:

  • The fitcensemble function with a cross-validation name-value pair. The names are 'crossval', 'kfold', 'holdout', 'leaveout', or 'cvpartition'.

  • The crossval method applied to a classification ensemble (see the sketch after this list).
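
For illustration, here is a minimal sketch of both ways to obtain such an ensemble, assuming the ionosphere data set used in the example below:

load ionosphere

% Way 1: cross-validate while training, using a name-value pair
cvens = fitcensemble(X,Y,'Method','Bag','crossval','on');

% Way 2: train a full ensemble, then apply the crossval method
ens = fitcensemble(X,Y,'Method','Bag');
cvens = crossval(ens);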

nlearn

A positive integer, the number of cycles for additional training of ens.

Name-Value Pair Arguments

Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside quotes. You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.

'nprint'

Printout frequency, a positive integer scalar or 'off' (no printouts). When you specify a positive integer, resume displays at the command line the number of weak learners trained so far. This option is useful when you train ensembles with many learners on large data sets.

Default: 'off'
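
For example, assuming cvens is a cross-validated classification ensemble such as the one in the example below, this call resumes training for 50 more cycles and prints progress after every 10 newly trained weak learners:

cvens = resume(cvens,50,'nprint',10);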

Output Arguments

ens1

The cross-validated classification ensemble ens, augmented with additional training.

Examples

Train a partitioned classification ensemble for 10 cycles, and compare the classification loss before and after resuming training for 10 more cycles.

Load the ionosphere data set.

load ionosphere

Train a partitioned classification ensemble for 10 cycles and examine the error.

rng(10,'twister') % For reproducibility
t = templateTree('MaxNumSplits',1); % Weak learner template tree object
cvens = fitcensemble(X,Y,'Method','GentleBoost','NumLearningCycles',10,'Learners',t,'crossval','on');
L = kfoldLoss(cvens)
L = 0.0940

Train for 10 more cycles and examine the new error.

cvens = resume(cvens,10);
L = kfoldLoss(cvens)
L = 0.0712

The cross-validation error of the ensemble is lower after training for 10 more cycles.
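
To see how the loss evolves with each cycle rather than only after the final one, one option is the cumulative mode of kfoldLoss, sketched here:

Lcum = kfoldLoss(cvens,'mode','cumulative');
plot(Lcum)
xlabel('Number of learning cycles')
ylabel('Cross-validation loss')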