updateMetricsAndFit

Update incremental learning model performance metrics on new data, then train model

Description

Given streaming data, updateMetricsAndFit first evaluates the performance of a configured incremental learning model for linear regression (incrementalRegressionLinear object) or linear, binary classification (incrementalClassificationLinear object) by calling updateMetrics on incoming data, and then updateMetricsAndFit fits the model to that data by calling fit. In other words, updateMetricsAndFit performs prequential evaluation because it treats each incoming chunk of data as a test set, and it tracks performance metrics measured cumulatively and over a specified window [1].

updateMetricsAndFit provides a simple way to update model performance metrics and train the model on each chunk of data. In contrast, you can perform the operations separately by calling updateMetrics and then fit instead, which allows for more flexibility (for example, you can decide whether you need to train the model based on its performance on a chunk of data).
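
A minimal sketch of the two approaches, assuming a configured incremental model Mdl, one incoming chunk of predictor data X and responses Y, and a hypothetical retraining criterion (none of these variables come from this page):

% One call: update the performance metrics, then fit the model to the chunk.
Mdl = updateMetricsAndFit(Mdl,X,Y);

% Two-step form: inspect the metrics before deciding whether to fit.
Mdl = updateMetrics(Mdl,X,Y);
if ~Mdl.IsWarm || Mdl.Metrics{1,"Cumulative"} > 0.1   % hypothetical criterion
    Mdl = fit(Mdl,X,Y);
end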

Mdl = updateMetricsAndFit(Mdl,X,Y) returns an incremental learning model Mdl, which is the input incremental learning model Mdl with the following modifications:

  1. updateMetricsAndFit measures the model performance on the incoming predictor and response data, X and Y, respectively. When the input model is warm (Mdl.IsWarm is true), updateMetricsAndFit overwrites previously computed metrics, stored in the Metrics property, with the new values. Otherwise, updateMetricsAndFit stores NaN values in Metrics instead.

  2. updateMetricsAndFit fits the modified model to the incoming data by following this procedure:

    1. Initialize the solver with the configurations and linear model coefficient and bias estimates of the input model Mdl.

    2. Fit the model to the data, and store the updated coefficient estimates and configurations in the output model Mdl.

The input and output models are the same data type.

Mdl = updateMetricsAndFit(Mdl,X,Y,Name,Value) uses additional options specified by one or more name-value pair arguments. For example, you can specify observation weights or that the columns of the predictor data matrix correspond to observations.
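
A minimal sketch of this syntax, assuming a chunk of predictor data Xt stored with observations in columns and a vector of observation weights W (hypothetical variables):

Mdl = updateMetricsAndFit(Mdl,Xt,Y,'ObservationsIn','columns','Weights',W);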

Examples

Create a default incremental linear SVM model for binary classification.

Mdl = incrementalClassificationLinear()
Mdl = 
  incrementalClassificationLinear

            IsWarm: 0
           Metrics: [1×2 table]
        ClassNames: [1×0 double]
    ScoreTransform: 'none'
              Beta: [0×1 double]
              Bias: 0
           Learner: 'svm'



Mdl is an incrementalClassificationLinear model. All its properties are read-only.

Mdl must be fit to data before you can use it to perform any other operations.

Load the human activity data set. Randomly shuffle the data.

load humanactivity
n = numel(actid);
rng(1); % For reproducibility
idx = randsample(n,n);
X = feat(idx,:);
Y = actid(idx);

For details on the data set, enter Description at the command line.

Responses can be one of five classes. Dichotomize the response by identifying whether the subject is moving (actid > 2).

Y = Y > 2;

Use updateMetricsAndFit to fit the incremental model to the training data, in chunks of 50 observations at a time, to simulate a data stream. At each iteration:

  • Process 50 observations.

  • Overwrite the previous incremental model with a new one fitted to the incoming observations.

  • Store β1, the cumulative metrics, and the window metrics to monitor their evolution during incremental learning.

% Preallocation
numObsPerChunk = 50;
nchunk = floor(n/numObsPerChunk);
ce = array2table(zeros(nchunk,2),'VariableNames',["Cumulative" "Window"]);
beta1 = zeros(nchunk+1,1);    

% Incremental fitting
for j = 1:nchunk
    ibegin = min(n,numObsPerChunk*(j-1) + 1);
    iend   = min(n,numObsPerChunk*j);
    idx = ibegin:iend;    
    Mdl = updateMetricsAndFit(Mdl,X(idx,:),Y(idx));
    ce{j,:} = Mdl.Metrics{"ClassificationError",:};
    beta1(j + 1) = Mdl.Beta(1);
end

Mdl is an incrementalClassificationLinear model object that has experienced all the data in the stream. During incremental learning and after the model is warmed up, updateMetricsAndFit checks the performance of the model on the incoming chunk of observations, and then fits the model to those observations.

To see how the performance metrics and β1 evolved during training, plot them on separate subplots.

figure;
subplot(2,1,1)
plot(beta1)
ylabel('\beta_1')
xlim([0 nchunk]);
xline(Mdl.EstimationPeriod/numObsPerChunk,'r-.');
subplot(2,1,2)
h = plot(ce.Variables);
xlim([0 nchunk]);
ylabel('Classification Error')
xline(Mdl.EstimationPeriod/numObsPerChunk,'r-.');
xline((Mdl.EstimationPeriod + Mdl.MetricsWarmupPeriod)/numObsPerChunk,'g-.');
legend(h,ce.Properties.VariableNames)
xlabel('Iteration')

The plot suggests that updateMetricsAndFit:

  • Fits β1 during all incremental learning iterations

  • Computes performance metrics only after the metrics warm-up period

  • Computes the cumulative metrics during each iteration

  • Computes the window metrics after processing 200 observations (4 iterations of 50 observations)
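
As a quick check (not part of the original example), you can locate the first iterations at which each metric becomes a number, assuming the default MetricsWarmupPeriod (1000 observations) and MetricsWindowSize (200 observations):

% First iteration with a cumulative metric; expected shortly after the
% 1000-observation warm-up period (iteration 21 with 50-observation chunks).
find(~isnan(ce.Cumulative),1)
% First iteration with a window metric; expected a few iterations later,
% once the 200-observation metrics window fills.
find(~isnan(ce.Window),1)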

Train a linear regression model by using fitrlinear, convert it to an incremental learner, and then track its performance and fit it to streaming chunks of data. Carry over training options from traditional to incremental learning.

Load and Preprocess Data

Load the 2015 NYC housing data set, and shuffle the data. For more details on the data, see NYC Open Data.

load NYCHousing2015
rng(1); % For reproducibility
n = size(NYCHousing2015,1);
idxshuff = randsample(n,n);
NYCHousing2015 = NYCHousing2015(idxshuff,:);

Suppose that the data collected from Manhattan (BOROUGH = 1) was collected using a new method that doubles its quality. Create a weight variable that assigns a weight of 2 to observations collected from Manhattan and a weight of 1 to all other observations.

n = size(NYCHousing2015,1);
NYCHousing2015.W = ones(n,1) + (NYCHousing2015.BOROUGH == 1);

Extract the response variable SALEPRICE from the table. For numerical stability, scale SALEPRICE by 1e6.

Y = NYCHousing2015.SALEPRICE/1e6;
NYCHousing2015.SALEPRICE = [];

Create dummy variable matrices from the categorical predictors.

catvars = ["BOROUGH" "BUILDINGCLASSCATEGORY" "NEIGHBORHOOD"];
dumvarstbl = varfun(@(x)dummyvar(categorical(x)),NYCHousing2015,...
    'InputVariables',catvars);
dumvarmat = table2array(dumvarstbl);
NYCHousing2015(:,catvars) = [];

Treat all other numeric variables in the table as linear predictors of sales price. Concatenate the matrix of dummy variables to the rest of the predictor data. Transpose the resulting predictor matrix.

idxnum = varfun(@isnumeric,NYCHousing2015,'OutputFormat','uniform');
X = [dumvarmat NYCHousing2015{:,idxnum}]';

Train Linear Regression Model

Fit a linear regression model to a random sample of half the data.

idxtt = randsample([true false],n,true);
TTMdl = fitrlinear(X(:,idxtt),Y(idxtt),'ObservationsIn','columns',...
    'Weights',NYCHousing2015.W(idxtt))
TTMdl = 
  RegressionLinear
         ResponseName: 'Y'
    ResponseTransform: 'none'
                 Beta: [313×1 double]
                 Bias: 0.1116
               Lambda: 2.1977e-05
              Learner: 'svm'

TTMdl is a RegressionLinear model object representing a traditionally trained linear regression model.

Convert Trained Model

Convert the traditionally trained linear regression model to a linear regression model for incremental learning.

IncrementalMdl = incrementalLearner(TTMdl)
IncrementalMdl = 
  incrementalRegressionLinear

               IsWarm: 1
              Metrics: [1×2 table]
    ResponseTransform: 'none'
                 Beta: [313×1 double]
                 Bias: 0.1116
              Learner: 'svm'

Track Performance Metrics and Fit Model

Use the updateMetricsAndFit function to perform incremental learning on the rest of the data. Simulate a data stream by processing 500 observations at a time. At each iteration:

  1. Call updateMetricsAndFit to update the cumulative and window epsilon insensitive loss of the model given the incoming chunk of observations, and then fit the model to the data. Overwrite the previous incremental model to update the losses in the Metrics property. Specify that the observations are oriented in columns, and specify the observation weights.

  2. Store the losses and last estimated coefficient β313.

% Preallocation
idxil = ~idxtt;
nil = sum(idxil);
numObsPerChunk = 500;
nchunk = floor(nil/numObsPerChunk);
ei = array2table(zeros(nchunk,2),'VariableNames',["Cumulative" "Window"]);
beta313 = [IncrementalMdl.Beta(end); zeros(nchunk,1)];
Xil = X(:,idxil);
Yil = Y(idxil);
Wil = NYCHousing2015.W(idxil);

% Incremental fitting
for j = 1:nchunk
    ibegin = min(nil,numObsPerChunk*(j-1) + 1);
    iend   = min(nil,numObsPerChunk*j);
    idx = ibegin:iend;
    IncrementalMdl = updateMetricsAndFit(IncrementalMdl,Xil(:,idx),Yil(idx),...
        'ObservationsIn','columns','Weights',Wil(idx));
    ei{j,:} = IncrementalMdl.Metrics{"EpsilonInsensitiveLoss",:};
    beta313(j + 1) = IncrementalMdl.Beta(end);
end

IncrementalMdl is an incrementalRegressionLinear model object that has experienced all the data in the stream.

Alternatively, you can call updateMetrics to update the performance metrics of the model given a new chunk of data, and then call fit to fit the model to that data.

Plot trace plots of the performance metrics and the estimated coefficient β313.

figure;
subplot(2,1,1)
h = plot(ei.Variables);
xlim([0 nchunk]);
ylabel('Epsilon Insensitive Loss')
legend(h,ei.Properties.VariableNames)
subplot(2,1,2)
plot(beta313)
ylabel('\beta_{313}')
xlim([0 nchunk]);
xlabel('Iteration')

The cumulative loss gradually changes with each iteration (chunk of 500 observations), whereas the window loss jumps because the metrics window size is 200 by default; updateMetricsAndFit measures the performance based on the latest 200 observations in each 500-observation chunk.

β313 changes, but levels off quickly, as updateMetricsAndFit processes chunks of observations.
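
If you prefer the window metric to refresh with every 500-observation chunk, one option (an assumption, not part of the original example) is to set the window size when you convert the model; MetricsWindowSize is a documented incrementalLearner name-value argument:

IncrementalMdl2 = incrementalLearner(TTMdl,'MetricsWindowSize',500);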

Input Arguments

Incremental learning model whose performance is measured and which is then fit to data, specified as an incrementalClassificationLinear or incrementalRegressionLinear model object, created directly or by converting a supported, traditionally trained machine learning model using incrementalLearner. For more details, see the reference page corresponding to the learning problem.

If Mdl.IsWarm is false, updateMetricsAndFit does not track the performance of the model. For more details, see Algorithms.

Chunk of predictor data with which to measure the model performance, and then to fit the model to, specified as a floating-point matrix of n observations and Mdl.NumPredictors predictor variables. The value of the 'ObservationsIn' name-value pair argument determines the orientation of the variables and observations.

The length of the observation labels Y and the number of observations in X must be equal; Y(j) is the label of observation (row or column) j in X.

Note

  • If Mdl.NumPredictors = 0, updateMetricsAndFit infers the number of predictors from X, and sets the corresponding property of the output model. Otherwise, if the number of predictor variables in the streaming data changes from Mdl.NumPredictors, updateMetricsAndFit issues an error.

  • updateMetricsAndFit supports only floating-point input predictor data. If the input model Mdl represents a converted, traditionally trained model fit to categorical data, use dummyvar to convert each categorical variable to a numeric matrix of dummy variables, and concatenate all dummy variable matrices and any other numeric predictors. For more details, see Dummy Variables.

Data Types: single | double
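
The note above mentions converting categorical predictors with dummyvar; the following is a minimal sketch using a hypothetical categorical variable and numeric predictor (the NYC housing example above shows a fuller version based on varfun):

color = categorical(["red";"blue";"red";"green"]);   % hypothetical categorical variable
xnum = [1.5; 2.0; 0.7; 3.1];                         % hypothetical numeric predictor
Xchunk = [dummyvar(color) xnum];                     % dummy-variable columns, then numeric predictors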

Chunk of labels with which to measure the model performance, and then to fit the model to, specified as a categorical, character, or string array, logical or floating-point vector, or cell array of character vectors for classification problems, and a floating-point vector for regression problems.

The length of the observation labels Y and the number of observations in X must be equal; Y(j) is the label of observation j (row or column) in X.

For classification problems:

  • updateMetricsAndFit supports binary classification only.

  • When the ClassNames property of the input model Mdl is nonempty, the following conditions apply:

    • If Y contains a label that is not a member of Mdl.ClassNames, updateMetricsAndFit issues an error.

    • The data type of Y and Mdl.ClassNames must be the same.

Data Types: char | string | cell | categorical | logical | single | double

Note

  • If an observation (predictor or label) or weight contains at least one missing (NaN) value, updateMetricsAndFit ignores the observation. Consequently, updateMetricsAndFit uses fewer than n observations to compute the model performance.

  • The chunk size n and the stochastic gradient descent (SGD) hyperparameter batch size (Mdl.BatchSize) can be different values. If n < Mdl.BatchSize, updateMetricsAndFit uses the n available observations when it applies SGD.

Name-Value Pair Arguments

Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside quotes. You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.

Example: 'ObservationsIn','columns','Weights',W specifies that the columns of the predictor matrix correspond to observations, and the vector W contains observation weights to apply during incremental learning.

Predictor data observation dimension, specified as the comma-separated pair consisting of 'ObservationsIn' and 'columns' or 'rows'. The default value is 'rows'.

Chunk of observation weights, specified as the comma-separated pair consisting of 'Weights' and a floating-point vector of positive values. updateMetricsAndFit weighs the observations in X with the corresponding values in Weights. The size of Weights must equal n, which is the number of observations in X.

By default, Weights is ones(n,1).

For more details, including normalization schemes, see Observation Weights.

Data Types: double | single

Output Arguments

Updated incremental learning model, returned as an incremental learning model object of the same data type as the input model Mdl, either incrementalClassificationLinear or incrementalRegressionLinear.

When you call updateMetricsAndFit, the following conditions apply:

  • If the model is not warm, updateMetricsAndFit does not compute performance metrics. As a result, the Metrics property of Mdl remains completely composed of NaN values. For more details, see Algorithms.

  • If Mdl.EstimationPeriod > 0, the incremental fitting functions updateMetricsAndFit and fit estimate hyperparameters using the first Mdl.EstimationPeriod observations passed to them; they do not train the input model on that data. However, if an incoming chunk of n observations is greater than or equal to the number of observations remaining in the estimation period m, updateMetricsAndFit uses the first m observations of the chunk to finish estimating hyperparameters, and fits the input model to the remaining n – m observations. Consequently, the software updates the Beta and Bias properties, hyperparameter properties, and record-keeping properties such as NumTrainingObservations.

For classification problems, if the ClassNames property of the input model Mdl is an empty array, updateMetricsAndFit sets the ClassNames property of the output model Mdl to unique(Y).
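
A minimal sketch of the estimation-period behavior described above, using hypothetical data and parameter values; the expected property values noted in the comments are assumptions based on that description, not guaranteed output:

rng(0)
X = randn(100,4);
Y = X*[1; -2; 0.5; 3] + 0.1*randn(100,1);

% Request an SGD solver and a 60-observation estimation period; 'Solver' and
% 'EstimationPeriod' are incrementalRegressionLinear name-value arguments.
Mdl = incrementalRegressionLinear('Solver','sgd','EstimationPeriod',60);

% The first 50-observation chunk falls entirely within the estimation period,
% so it is used only to estimate hyperparameters.
Mdl = updateMetricsAndFit(Mdl,X(1:50,:),Y(1:50));
Mdl.NumTrainingObservations  % expected to remain 0 during the estimation period

% The second chunk completes the estimation period, and the observations left
% over in the chunk are fit to the model.
Mdl = updateMetricsAndFit(Mdl,X(51:100,:),Y(51:100));
Mdl.NumTrainingObservations  % expected to be positive once fitting begins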

Algorithms

Performance Metrics

  • updateMetricsAndFit tracks model performance metrics, specified by the row labels of the table in Mdl.Metrics, from new data only when the incremental model is warm (the IsWarm property is true). An incremental model becomes warm after fit or updateMetricsAndFit fits the incremental model to Mdl.MetricsWarmupPeriod observations, which is the metrics warm-up period.

    If Mdl.EstimationPeriod > 0, the functions estimate hyperparameters before fitting the model to data, and, therefore, the functions must process an additional EstimationPeriod observations before the model starts the metrics warm-up period.

  • The Metrics property of the incremental model stores two forms of each performance metric as variables (columns) of a table, Cumulative and Window, with individual metrics oriented along rows. When the incremental model is warm, updateMetricsAndFit updates the metrics at the following frequencies:

    • Cumulative — The functions compute cumulative metrics since the start of model performance tracking. The functions update metrics every time you call the functions and base the calculation on the entire supplied data set.

    • Window — The functions compute metrics based on all observations within a window determined by the Mdl.MetricsWindowSize property. Mdl.MetricsWindowSize also determines the frequency at which the software updates Window metrics. For example, if Mdl.MetricsWindowSize is 20, the functions compute metrics based on the last 20 observations in the supplied data (X((end - 20 + 1):end,:) and Y((end - 20 + 1):end)).

      Incremental functions track performance metrics within a window by using the following process:

      1. For each specified metric, store a buffer of length Mdl.MetricsWindowSize and a buffer of observation weights.

      2. Populate elements of the metrics buffer with the model performance based on batches of incoming observations, and store the corresponding observation weights in the weights buffer.

      3. When the buffer is filled, overwrite Mdl.Metrics.Window with the weighted average performance in the metrics window. If the buffer is overfilled when the function processes a batch of observations, the latest, incoming Mdl.MetricsWindowSize observations enter the buffer, and the earliest observations are removed from the buffer. For example, suppose Mdl.MetricsWindowSize is 20, the metrics buffer has 10 values from a previously processed batch, and 15 values are incoming. To compose the length 20 window, the functions use the measurements from the 15 incoming observations and the latest 5 measurements from the previous batch.
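
The following is only a schematic sketch of that buffering process (an assumption for illustration, not the toolbox implementation), using hypothetical per-observation metric values and unit weights:

W = 20;                      % plays the role of Mdl.MetricsWindowSize
metricBuf = nan(W,1);        % per-observation metric buffer
weightBuf = nan(W,1);        % corresponding observation-weight buffer
batches = {rand(15,1), rand(15,1)};   % hypothetical losses from two incoming batches

for b = 1:numel(batches)
    newMetrics = batches{b};
    newWeights = ones(size(newMetrics));
    k = numel(newMetrics);
    % Shift the newest measurements in; the oldest fall out of the window.
    metricBuf = [metricBuf(k+1:end); newMetrics];
    weightBuf = [weightBuf(k+1:end); newWeights];
    if all(~isnan(metricBuf))
        % Buffer is full: the window metric is the weighted average of the
        % latest W per-observation measurements.
        windowMetric = sum(weightBuf.*metricBuf)/sum(weightBuf)
    end
end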

Observation Weights

For classification problems, if the prior class probability distribution is known (Mdl.Prior is not composed of NaN values), updateMetricsAndFit normalizes observation weights to sum to the prior class probabilities in the respective classes. This action implies that observation weights are the respective prior class probabilities by default.

For regression problems or if the prior class probability distribution is unknown, the software normalizes the specified observation weights to sum to 1 each time you call updateMetricsAndFit.
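
For example (a small arithmetic sketch, not from this page), specified weights of [2 1 1 2] for a four-observation regression chunk would be rescaled to [1/3 1/6 1/6 1/3], which sum to 1:

w = [2 1 1 2]';
wNormalized = w/sum(w)   % 0.3333, 0.1667, 0.1667, 0.3333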

References

[1] Bifet, Albert, Ricard Gavaldà, Geoffrey Holmes, and Bernhard Pfahringer. Machine Learning for Data Streams with Practical Examples in MOA. Cambridge, MA: The MIT Press, 2007.

Introduced in R2020b