resume

Resume training ensemble

Syntax

ens1 = resume(ens,nlearn)
ens1 = resume(ens,nlearn,Name,Value)

Description

ens1 = resume(ens,nlearn) trains ens in every fold for nlearn more cycles. resume uses the same training options that fitrensemble used to create ens.

ens1 = resume(ens,nlearn,Name,Value) trains ens with additional options specified by one or more Name,Value pair arguments.

Input Arguments

ens

A cross-validated regression ensemble. ens is the result of either:

  • The fitrensemble function with a cross-validation name-value pair argument. The names are 'CrossVal', 'KFold', 'Holdout', 'Leaveout', or 'CVPartition'.

  • The crossval method applied to a regression ensemble.
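Both construction paths produce a cross-validated ensemble that resume can extend. A minimal sketch using the carsmall data set (variable names are illustrative):

```matlab
load carsmall
X = [Displacement Horsepower Weight];

% Path 1: cross-validate at training time with a name-value argument
cvens1 = fitrensemble(X,MPG,'NumLearningCycles',50,'CrossVal','on');

% Path 2: train an ordinary ensemble, then apply the crossval method
ens = fitrensemble(X,MPG,'NumLearningCycles',50);
cvens2 = crossval(ens);

% Either cvens1 or cvens2 can be passed to resume
```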

nlearn

A positive integer, the number of cycles for additional training of ens.

Name-Value Pair Arguments

Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside quotes. You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.

'nprint'

Printout frequency, specified as a positive integer scalar or 'off' (no printouts). When you specify a positive integer, resume displays the number of weak learners trained so far at the command line. Printouts are useful when you train an ensemble with many learners on a large data set.

Default: 'off'
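As a self-contained sketch of how this option might be used (the exact printout text can vary by release):

```matlab
load carsmall
X = [Displacement Horsepower Weight];
cvens = crossval(fitrensemble(X,MPG,'NumLearningCycles',50));

% Report progress after every 10 additional weak learners
cvens = resume(cvens,50,'nprint',10);
```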

Output Arguments

ens1

The cross-validated regression ensemble ens, augmented with additional training.

Examples


Examine the cross-validation error after training a regression ensemble for more cycles.

Load the carsmall data set and select displacement, horsepower, and vehicle weight as predictors.

load carsmall
X = [Displacement Horsepower Weight];

Train a regression ensemble for 50 cycles.

ens = fitrensemble(X,MPG,'NumLearningCycles',50); 

Cross-validate the ensemble and examine the cross-validation error.

rng(10,'twister') % For reproducibility
cvens = crossval(ens);
L = kfoldLoss(cvens)
L = 27.9435

Train for 50 more cycles and examine the new cross-validation error.

cvens = resume(cvens,50);
L = kfoldLoss(cvens)
L = 28.7114

The additional training did not improve the cross-validation error.
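To judge whether further resume calls are worthwhile, you can examine the loss as a function of ensemble size rather than at a single point, for example with the 'mode','cumulative' option of kfoldLoss. A sketch continuing the example above:

```matlab
% Plot cross-validation loss versus the number of learning cycles
load carsmall
X = [Displacement Horsepower Weight];
rng(10,'twister') % For reproducibility
cvens = crossval(fitrensemble(X,MPG,'NumLearningCycles',50));
cvens = resume(cvens,50);
plot(kfoldLoss(cvens,'mode','cumulative'))
xlabel('Number of learning cycles')
ylabel('Cross-validation MSE')
```

If the curve flattens or rises well before the final cycle, additional training is unlikely to help.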