Experiment Manager

Design and run experiments to train and compare deep learning networks

Description

The Experiment Manager app enables you to create a deep learning experiment to train networks under various initial conditions and compare the results. For example, you can use deep learning experiments to:

  • Sweep through a range of hyperparameter values to train a deep network.

  • Compare the results of using different data sets to train a network.

  • Test different deep network architectures by reusing the same set of training data on several networks.

Experiment Manager provides visualization tools such as training plots and confusion matrices, filters to refine your experiment results, and the ability to define custom metrics to evaluate your results. To improve reproducibility, every time that you run an experiment, Experiment Manager stores a copy of the experiment definition. You can access past experiment definitions to keep track of the hyperparameter combinations that produce each of your results.

Experiment Manager organizes your experiments and results in a MATLAB project.

  • You can store several experiments in the same project.

  • Each experiment contains a set of results for each time that you run the experiment.

  • Each set of results consists of one or more trials, each of which corresponds to a different combination of hyperparameter values.

The Experiment Browser pane displays the hierarchy of experiments and results in the project. For instance, this project has two experiments, each of which has several sets of results. To open the configuration for an experiment and view its results, double-click the name of an experiment or a set of results.

Experiment Browser showing two experiments: Experiment1 with four sets of results and Experiment2 with two.

Open the Experiment Manager App

  • MATLAB® Toolstrip: On the Apps tab, under Machine Learning and Deep Learning, click the app icon.

  • MATLAB command prompt: Enter experimentManager.

Examples

This example shows how to use the default experiment setup function to train an image classification network by sweeping hyperparameters. The experiment uses the Digits data set. For more information on this data set, see Image Data Sets.

Open the example to load a project with a preconfigured experiment that you can inspect and run. To open the experiment, in the Experiment Browser, double-click the name of the experiment (Experiment1).

Alternatively, you can configure the experiment yourself by following these steps.

1. Open Experiment Manager.

2. Click New > Project and select the location and name for a new project. Experiment Manager opens a new experiment in the project. The Experiment pane displays the description, hyperparameter table, setup function, and metrics that define the experiment.

3. In the Description box, enter a description of the experiment:

Classification of digits, using various initial learning rates.

4. In the Hyperparameter Table, replace the value of myInitialLearnRate with 0.0025:0.0025:0.015.
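
The colon expression start:step:stop is standard MATLAB range syntax. Evaluating this value at the command prompt confirms that the sweep covers six learning rates:

myInitialLearnRate = 0.0025:0.0025:0.015
% myInitialLearnRate =
%     0.0025    0.0050    0.0075    0.0100    0.0125    0.0150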

5. Under Setup Function, click Edit. The setup function opens in the MATLAB Editor. The setup function specifies the training data, network architecture, and training options for the experiment. By default, the template for the setup function has three sections, which the sketch after this list illustrates.

  • Load Image Data defines image datastores containing the training and validation data for the experiment. The data consists of 10,000 28-by-28 pixel grayscale images of digits from 0 to 9, categorized by the digit they represent.

  • Define Network Architecture defines the architecture for a simple convolutional neural network for deep learning classification.

  • Specify Training Options defines a trainingOptions object for the experiment. By default, the template loads the values for the training option 'InitialLearnRate' from the myInitialLearnRate entry in the hyperparameter table.
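
Together, these sections might look something like the following minimal sketch. It is illustrative rather than a copy of the shipped template: the dataset path, the 90/10 split, the layer sizes, and the fixed training option values are assumptions, and the function name follows the Experiment1_setup1 naming convention that the app uses for generated setup functions.

function [imdsTrain,layers,options] = Experiment1_setup1(params)
% Load Image Data: create a datastore for the Digits data set and split
% it into training and validation sets (90/10 split assumed here).
digitDatasetPath = fullfile(toolboxdir('nnet'),'nndemos', ...
    'nndatasets','DigitDataset');
imds = imageDatastore(digitDatasetPath, ...
    'IncludeSubfolders',true,'LabelSource','foldernames');
[imdsTrain,imdsValidation] = splitEachLabel(imds,0.9,'randomize');

% Define Network Architecture: a simple convolutional network for
% classifying the ten digit classes.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,8,'Padding','same')
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

% Specify Training Options: read the learning rate for this trial from
% the hyperparameter table.
options = trainingOptions('sgdm', ...
    'MaxEpochs',5, ...
    'ValidationData',imdsValidation, ...
    'ValidationFrequency',30, ...
    'InitialLearnRate',params.myInitialLearnRate);
end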

In Experiment Manager, click Run. Experiment Manager trains the network defined by the setup function six times. Each trial uses one of the learning rates specified in the hyperparameter table. A table of results displays the accuracy and loss for each trial.

While the experiment is running, click Training Plot to display the training plot and track the progress of each trial. You can also monitor the training progress in the MATLAB Command Window.

Click Confusion Matrix to display the confusion matrix for the validation data in each completed trial.

When the experiment finishes, you can sort the table by column or filter trials by using the Filters pane. For more information, see Sort and Filter Experiment Results.

To test the performance of an individual trial, export the trained network or the training information for the trial. On the Experiment Manager tab, select Export > Trained Network or Export > Training Information, respectively. For more information, see Output Arguments.
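
For example, if you export the trained network to the workspace (the export dialog lets you choose the variable name; trainedNetwork is assumed here), you can classify new observations with it. The variable img is a placeholder for a 28-by-28 grayscale digit image that you supply:

% Assumes trainedNetwork was exported from Experiment Manager and img is
% a 28-by-28 grayscale image of a digit.
label = classify(trainedNetwork,img);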

To close the experiment, in the Experiment Browser, right-click the name of the project and select Close Project. Experiment Manager saves your results and closes all of the experiments contained in the project.

This example shows how to configure an experiment to train an image regression network by sweeping hyperparameters. The experiment uses the Digits data set. For more information on this data set, see Image Data Sets.

Open the example to load a project with a preconfigured experiment that you can inspect and run. To open the experiment, in the Experiment Browser, double-click the name of the experiment (Experiment1).

Alternatively, you can configure the experiment yourself by following these steps.

1. Open Experiment Manager.

2. Click New > Project and select the location and name for a new project. Experiment Manager opens a new experiment in the project. The Experiment pane displays the description, hyperparameter table, setup function, and metrics that define the experiment.

3. In the Description box, enter a description of the experiment:

Regression model to predict angles of rotation of digits,
using various initial learning rates.

4. In the Hyperparameter Table, replace the value of myInitialLearnRate with 0.001:0.001:0.006.

5. Under Setup Function, click Edit. The setup function opens in the MATLAB Editor.

  • Modify the setup function signature to return four outputs. Experiment Manager passes these outputs to the trainNetwork function to train a network for image regression problems.

function [XTrain,YTrain,layers,options] = Experiment1_setup1(params)

  • Modify the Load Image Data section of the setup function to define the training and validation data for the experiment as 4-D arrays. The training and validation data sets each contain 5000 images of digits from 0 to 9. The regression values correspond to the angles of rotation of the digits. Be sure to delete all of the existing code in this section of the setup function.

[XTrain,~,YTrain] = digitTrain4DArrayData;
[XValidation,~,YValidation] = digitTest4DArrayData;

  • Modify the Define Network Architecture section of the setup function to define a convolutional neural network for regression. Be sure to delete all of the existing code in this section of the setup function.

layers = [
    imageInputLayer([28 28 1])                % 28-by-28 grayscale input
    convolution2dLayer(3,8,'Padding','same')  % convolution-batchnorm-ReLU blocks
    batchNormalizationLayer
    reluLayer
    averagePooling2dLayer(2,'Stride',2)       % downsample by a factor of 2
    convolution2dLayer(3,16,'Padding','same')
    batchNormalizationLayer
    reluLayer
    averagePooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,32,'Padding','same')
    batchNormalizationLayer
    reluLayer
    convolution2dLayer(3,32,'Padding','same')
    batchNormalizationLayer
    reluLayer
    dropoutLayer(0.2)                         % regularization before the output
    fullyConnectedLayer(1)                    % one output: the angle of rotation
    regressionLayer];                         % regression loss, not classification

  • Modify the Specify Training Options section of the setup function to use the validation data in the 4-D arrays XValidation and YValidation. This section of the setup function loads the values for the training option 'InitialLearnRate' from the myInitialLearnRate entry in the hyperparameter table.

options = trainingOptions('sgdm', ...
    'MaxEpochs',5, ...
    'ValidationData',{XValidation,YValidation}, ...
    'ValidationFrequency',30, ...
    'InitialLearnRate',params.myInitialLearnRate);

In Experiment Manager, click Run. Experiment Manager trains the network defined by the setup function six times. Each trial uses one of the learning rates specified in the hyperparameter table. A table of results displays the root mean squared error (RMSE) and loss for each trial.

While the experiment is running, click Training Plot to display the training plot and track the progress of each trial. You can also monitor the training progress in the MATLAB Command Window.

When the experiment finishes, you can sort the table by column or filter trials by using the Filters pane. For more information, see Sort and Filter Experiment Results.

To test the performance of an individual trial, export the trained network or the training information for the trial. On the Experiment Manager tab, select Export > Trained Network or Export > Training Information, respectively. For more information, see Output Arguments.

To close the experiment, in the Experiment Browser, right-click the name of the project and select Close Project. Experiment Manager saves your results and closes all of the experiments contained in the project.

This example shows how to set up an experiment using the Experiment Manager app.

Experiment definitions consist of a description, a hyperparameter table, a setup function, and (optionally) a collection of metric functions to evaluate the results of the experiment.

In the Description box, enter a description of the experiment.

In the Hyperparameter Table, specify names and values of the hyperparameters used in the experiment. When you run the experiment, Experiment Manager sweeps through the hyperparameter values and trains the network multiple times. Each trial uses a different combination of the hyperparameter values specified in the table. Specify hyperparameters as scalars or vectors with numeric, logical, or string values. For example, these are valid hyperparameter specifications:

  • 0.01

  • 0.01:0.01:0.05

  • [0.01 0.02 0.04 0.08]

  • ["sgdm" "rmsprop" "adam"]

The Setup Function configures the training data, network architecture, and training options for the experiment. The input to the setup function is a struct with fields from the hyperparameter table. The output of the setup function must match the input of the trainNetwork function. The setup function supports these signatures, depending on the goal of the experiment:

  • Train a network for image classification problems using the image datastore imds to store the input image data:

    function [imds,layers,options] = Experiment_setup(params)
    ...
    end

  • Train a network using the datastore ds:

    function [ds,layers,options] = Experiment_setup(params)
    ...
    end

  • Train a network for image classification and regression problems using the numeric arrays X to store the predictor variables and Y to store the categorical labels or numeric responses:

    function [X,Y,layers,options] = Experiment_setup(params)
    ...
    end

  • Train a network for sequence classification and regression problems using sequences to store the sequence or time-series predictors and Y to store the responses:

    function [sequences,Y,layers,options] = Experiment_setup(params)
    ...
    end

  • Train a network for classification and regression problems using the table tbl to store numeric data or file paths to the data:

    function [tbl,layers,options] = Experiment_setup(params)
    ...
    end

  • Train a network for classification and regression problems using responseName to specify the response variables in tbl:

    function [tbl,responseName,layers,options] = Experiment_setup(params)
    ...
    end

Note

Experiment Manager does not support the 'multi-gpu' or 'parallel' values for the training option 'ExecutionEnvironment'.
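
For example, to train each trial on a single GPU (assuming a supported GPU is available), you can set the option explicitly in the setup function:

options = trainingOptions('sgdm', ...
    'ExecutionEnvironment','gpu', ... % single-GPU training; 'multi-gpu' and 'parallel' are not supported
    'InitialLearnRate',params.myInitialLearnRate);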

The Metrics section specifies functions to evaluate the results of the experiment. The input to a metric function is a struct with three fields:

  • trainedNetwork: The SeriesNetwork object or DAGNetwork object returned by the trainNetwork function.

  • trainingInfo: A struct containing the training information returned by the trainNetwork function.

  • parameters: A struct with fields from the hyperparameter table.

The output of a metric function must be a scalar number, a logical value, or a string.
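
As a sketch, a custom metric that reports the final validation loss recorded by trainNetwork might look like this. The function and argument names are hypothetical, and the FinalValidationLoss field of the training information is populated only when the training options include validation data:

function metric = FinalValidationLoss(trialInfo)
% trialInfo.trainingInfo holds the information struct returned by trainNetwork.
metric = trialInfo.trainingInfo.FinalValidationLoss;
end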

This example shows how to compare the results of running an experiment.

When you run an experiment, Experiment Manager trains the network defined by the setup function multiple times. Each trial uses a different combination of hyperparameters. When the experiment finishes, a table displays training and validation metrics (such as accuracy, RMSE, and loss) for each trial. To compare the results of an experiment, you can use the training and validation metrics to sort the results table and filter trials.

To sort the trials in the results table, use the drop-down menu for the column corresponding to a training or validation metric.

  1. Point to the header of a column by which you want to sort.

  2. Click the triangle icon.

  3. Select Sort in Ascending Order or Sort in Descending Order.

    Results table showing the drop-down menu for the Validation Accuracy column.

To filter trials from the results table, use the Filters pane.

  1. On the Experiment Manager tab, select Filters.

    The Filters pane shows histograms for the numeric metrics in the results table. To remove a histogram from the Filters pane, in the results table, open the drop-down menu for the corresponding column and clear the Show Filter check box.

  2. Adjust the sliders under the histogram for the training or validation metric by which you want to filter.

    Histogram for Validation Loss, with the filter sliders set to 1.45 and 1.55.

    The results table shows only the trials with a metric value in the selected range.

    Results table showing only the trials with Validation Loss between 1.45 and 1.55.

  3. To restore all of the trials in the results table, close the Experiment Result pane and reopen the results from the Experiment Browser.

This example shows how to inspect the configuration of an experiment that produced a given result.

The Experiment Source pane contains a read-only copy of the experiment description and hyperparameter table, as well as links to the setup and metric functions called by the experiment. You can use the information in this pane to track the configuration of data, network, and training options that produce each of your results.

For instance, suppose that you run an experiment multiple times. Each time that you run the experiment, you change the contents of the setup function but always use the same name. The first time that you run the experiment, you use the default classification network provided by the setup function template. The second time that you run the experiment, you modify the setup function to load a pretrained GoogLeNet network, replacing the final layers with new layers for transfer learning. For an example that uses these two network architectures, see Create a Deep Learning Experiment for Classification.

On the first Experiment Result pane, click the View Experiment Source link. Experiment Manager opens an Experiment Source pane that contains the experiment definition that produced the first set of results. Click the link at the bottom of the pane to open the setup function that you used the first time you ran the experiment. You can copy this setup function to rerun the experiment using a simple classification network.

On the second Experiment Result pane, click the View Experiment Source link. Experiment Manager opens an Experiment Source pane that contains the experiment definition that produced the second set of results. Click the link at the bottom of the pane to open the setup function that you used the second time you ran the experiment. You can copy this setup function to rerun the experiment using transfer learning.

Experiment Manager stores a copy of the setup and custom metric functions that you use, so you do not have to manually rename these functions when you modify and rerun an experiment.

Tips

To visualize, build, and train a network without sweeping hyperparameters, try the Deep Network Designer app.

Introduced in R2020a