Design and run experiments to train and compare deep learning networks
The Experiment Manager app enables you to create a deep learning experiment to train networks under various initial conditions and compare the results. For example, you can use deep learning experiments to:
Sweep through a range of hyperparameter values or use Bayesian optimization to find optimal training options. Bayesian optimization requires Statistics and Machine Learning Toolbox™.
Compare the results of using different data sets to train a network.
Test different deep network architectures by reusing the same set of training data on several networks.
Experiment Manager provides visualization tools such as training plots and confusion matrices, filters to refine your experiment results, and the ability to define custom metrics to evaluate your results. To improve reproducibility, every time that you run an experiment, Experiment Manager stores a copy of the experiment definition. You can access past experiment definitions to keep track of the hyperparameter combinations that produce each of your results.
Experiment Manager organizes your experiments and results in a project.
You can store several experiments in the same project.
Each experiment contains a set of results for each time that you run the experiment.
Each set of results consists of one or more trials corresponding to a different combination of hyperparameters.
By default, Experiment Manager runs one trial at a time. If you have Parallel Computing Toolbox™, you can configure your experiment to run multiple trials simultaneously. Running an experiment in parallel allows you to use MATLAB® while the training is in progress.
The Experiment Browser pane displays the hierarchy of experiments and results in the project. For instance, this project has two experiments, each of which has several sets of results. To open the configuration for an experiment and view its results, double-click the name of an experiment or a set of results.
MATLAB Toolstrip: On the Apps tab, under Machine Learning and Deep Learning, click the app icon.
MATLAB command prompt: Enter experimentManager.
This example shows how to use the default experiment setup function to train an image classification network by sweeping hyperparameters. For more examples of solving image classification problems with Experiment Manager, see Create a Deep Learning Experiment for Classification and Use Experiment Manager to Train Networks in Parallel. For more information on an alternative strategy to sweeping hyperparameters, see Tune Experiment Hyperparameters by Using Bayesian Optimization.
Open the example to load a project with a preconfigured experiment that you can inspect and run. To open the experiment, in the Experiment Browser pane, double-click the name of the experiment (Experiment1).
Alternatively, you can configure the experiment yourself by following these steps.
1. Open Experiment Manager.
2. Click New > Project and select the location and name for a new project. Experiment Manager opens a new experiment in the project. The Experiment pane displays the description, hyperparameters, setup function, and metrics that define the experiment.
3. In the Description box, enter a description of the experiment:
Classification of digits, using various initial learning rates.
4. Under Hyperparameters, replace the value of myInitialLearnRate with 0.0025:0.0025:0.015. Verify that Strategy is set to Exhaustive Sweep.
5. Under Setup Function, click Edit. The setup function opens in MATLAB Editor. The setup function specifies the training data, network architecture, and training options for the experiment. By default, the template for the setup function has three sections.
Load Image Data defines image datastores containing the training and validation data for the experiment. The experiment uses the Digits data set, which consists of 10,000 28-by-28 pixel grayscale images of digits from 0 to 9, categorized by the digit they represent. For more information on this data set, see Image Data Sets.
Define Network Architecture defines the architecture for a simple convolutional neural network for deep learning classification.
Specify Training Options defines a trainingOptions object for the experiment. By default, the template loads the value for the training option 'InitialLearnRate' from the myInitialLearnRate entry in the hyperparameter table.
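For reference, the complete default setup function might look like the following condensed sketch. The data set path, layer choices, and option values here are assumptions based on the template description above; the template that Experiment Manager generates for you can differ in its details.

function [imdsTrain,layers,options] = Experiment1_setup1(params)
% Load Image Data: create a datastore for the Digits data set and
% split it into training and validation sets.
digitDatasetPath = fullfile(matlabroot,'toolbox','nnet','nndemos', ...
    'nndatasets','DigitDataset');
imds = imageDatastore(digitDatasetPath, ...
    'IncludeSubfolders',true,'LabelSource','foldernames');
[imdsTrain,imdsValidation] = splitEachLabel(imds,0.9,'randomize');

% Define Network Architecture: a simple convolutional network that
% classifies the 28-by-28 grayscale images into the 10 digit classes.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,8,'Padding','same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

% Specify Training Options: read 'InitialLearnRate' from the
% myInitialLearnRate entry in the hyperparameter table.
options = trainingOptions('sgdm', ...
    'MaxEpochs',5, ...
    'ValidationData',imdsValidation, ...
    'InitialLearnRate',params.myInitialLearnRate);
end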
When you run the experiment, Experiment Manager trains the network defined by the setup function six times. Each trial uses one of the learning rates specified in the hyperparameter table. By default, Experiment Manager runs one trial at a time. If you have Parallel Computing Toolbox, you can run multiple trials at the same time. For best results, before you run your experiment, start a parallel pool with as many workers as GPUs. For more information, see Use Experiment Manager to Train Networks in Parallel.
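For example, on a machine with one or more GPUs, you could start a suitably sized pool before running the experiment. This is a minimal sketch; it assumes Parallel Computing Toolbox is installed.

% Start one worker per available GPU before enabling Use Parallel.
numGPUs = gpuDeviceCount;
parpool(numGPUs);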
To run one trial of the experiment at a time, in the Experiment Manager toolstrip, click Run.
To run multiple trials at the same time, click Use Parallel and then Run. If there is no current parallel pool, Experiment Manager starts one using the default cluster profile. Experiment Manager then executes multiple simultaneous trials, depending on the number of parallel workers available.
A table of results displays the accuracy and loss for each trial.
While the experiment is running, click Training Plot to display the training plot and track the progress of each trial. You can also monitor the training progress in the MATLAB Command Window.
Click Confusion Matrix to display the confusion matrix for the validation data in each completed trial.
When the experiment finishes, you can sort the table by column or filter trials by using the Filters pane. For more information, see Sort and Filter Experiment Results.
To test the performance of an individual trial, export the trained network or the training information for the trial. On the Experiment Manager toolstrip, select Export > Trained Network or Export > Training Information, respectively. For more information, see net and info.
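For example, after exporting a trained classification network to the workspace, you could evaluate it on the validation data. The variable name trainedNetwork and the datastore imdsValidation are assumptions for illustration.

% Classify the validation images and compute the accuracy.
YPred = classify(trainedNetwork,imdsValidation);
accuracy = mean(YPred == imdsValidation.Labels);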
To close the experiment, in the Experiment Browser pane, right-click the name of the project and select Close Project. Experiment Manager closes all of the experiments and results contained in the project.
This example shows how to configure an experiment to train an image regression network by sweeping hyperparameters. For another example of solving a regression problem with Experiment Manager, see Create a Deep Learning Experiment for Regression.
Open the example to load a project with a preconfigured experiment that you can inspect and run. To open the experiment, in the Experiment Browser pane, double-click the name of the experiment (Experiment1).
Alternatively, you can configure the experiment yourself by following these steps.
1. Open Experiment Manager.
2. Click New > Project and select the location and name for a new project. Experiment Manager opens a new experiment in the project. The Experiment pane displays the description, hyperparameters, setup function, and metrics that define the experiment.
3. In the Description box, enter a description of the experiment:
Regression to predict angles of rotation of digits, using various initial learning rates.
4. Under Hyperparameters, replace the value of myInitialLearnRate with 0.001:0.001:0.006. Verify that Strategy is set to Exhaustive Sweep.
5. Under Setup Function, click Edit. The setup function opens in MATLAB Editor.
Modify the setup function signature to return four outputs. These outputs are used to call the trainNetwork function to train a network for image regression problems.
function [XTrain,YTrain,layers,options] = Experiment1_setup1(params)
Modify the Load Image Data section of the setup function to define the training and validation data for the experiment as 4-D arrays. In this experiment, the training and validation data each consist of 5000 images from the Digits data set. Each image shows a digit from 0 to 9, rotated by a certain angle. The regression values correspond to the angles of rotation. For more information on this data set, see Image Data Sets. Be sure to delete all of the existing code in this section of the setup function.
[XTrain,~,YTrain] = digitTrain4DArrayData;
[XValidation,~,YValidation] = digitTest4DArrayData;
Modify the Define Network Architecture section of the setup function to define a convolutional neural network for regression. Be sure to delete all of the existing code in this section of the setup function.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,8,'Padding','same')
    batchNormalizationLayer
    reluLayer
    averagePooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,16,'Padding','same')
    batchNormalizationLayer
    reluLayer
    averagePooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,32,'Padding','same')
    batchNormalizationLayer
    reluLayer
    convolution2dLayer(3,32,'Padding','same')
    batchNormalizationLayer
    reluLayer
    dropoutLayer(0.2)
    fullyConnectedLayer(1)
    regressionLayer];
Modify the Specify Training Options section of the setup function to use the validation data in the 4-D arrays XValidation and YValidation. This section of the setup function loads the value for the training option 'InitialLearnRate' from the myInitialLearnRate entry in the hyperparameter table.
options = trainingOptions('sgdm', ...
    'MaxEpochs',5, ...
    'ValidationData',{XValidation,YValidation}, ...
    'ValidationFrequency',30, ...
    'InitialLearnRate',params.myInitialLearnRate);
When you run the experiment, Experiment Manager trains the network defined by the setup function six times. Each trial uses one of the learning rates specified in the hyperparameter table. By default, Experiment Manager runs one trial at a time. If you have Parallel Computing Toolbox, you can run multiple trials at the same time. For best results, before you run your experiment, start a parallel pool with as many workers as GPUs. For more information, see Use Experiment Manager to Train Networks in Parallel.
To run one trial of the experiment at a time, in the Experiment Manager toolstrip, click Run.
To run multiple trials at the same time, click Use Parallel and then Run. If there is no current parallel pool, Experiment Manager starts one using the default cluster profile. Experiment Manager then executes multiple simultaneous trials, depending on the number of parallel workers available.
A table of results displays the root mean squared error (RMSE) and loss for each trial.
While the experiment is running, click Training Plot to display the training plot and track the progress of each trial. You can also monitor the training progress in the MATLAB Command Window.
When the experiment finishes, you can sort the table by column or filter trials by using the Filters pane. For more information, see Sort and Filter Experiment Results.
To test the performance of an individual trial, export the trained network or the training information for the trial. On the Experiment Manager toolstrip, select Export > Trained Network or Export > Training Information, respectively. For more information, see net and info.
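For example, after exporting a trained regression network to the workspace, you could measure its RMSE on the validation arrays defined in the setup function. The variable name trainedNetwork is an assumption for illustration.

% Predict the angles of rotation and compute the RMSE on the validation set.
YPred = predict(trainedNetwork,XValidation);
predictionError = YValidation - YPred;
rmse = sqrt(mean(predictionError.^2));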
To close the experiment, in the Experiment Browser pane, right-click the name of the project and select Close Project. Experiment Manager closes all of the experiments and results contained in the project.
This example shows how to set up an experiment using the Experiment Manager app.
Experiment definitions consist of a description, a table of hyperparameters, a setup function, and (optionally) a collection of metric functions to evaluate the results of the experiment.
In the Description box, enter a description of the experiment.
Under Hyperparameters, select the strategy to use for your experiment.
To sweep through a range of hyperparameter values, set Strategy to Exhaustive Sweep. In the hyperparameter table, specify the values of the hyperparameters used in the experiment. You can specify hyperparameter values as scalars or vectors with numeric, logical, or string values. For example, these are valid hyperparameter specifications:
0.01
0.01:0.01:0.05
[0.01 0.02 0.04 0.08]
["sgdm" "rmsprop" "adam"]
When you run the experiment, Experiment Manager trains the network using every combination of the hyperparameter values specified in the table.
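For example, with the hypothetical hyperparameters mySolver set to ["sgdm" "rmsprop" "adam"] and myInitialLearnRate set to 0.01:0.01:0.02, an exhaustive sweep runs 3 x 2 = 6 trials, and the setup function reads each combination from the fields of its input struct:

% Each trial receives one combination of hyperparameter values in params.
options = trainingOptions(params.mySolver, ...
    'InitialLearnRate',params.myInitialLearnRate);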
To use Bayesian optimization to find optimal training options, set Strategy to Bayesian Optimization. In the hyperparameter table, specify these properties of the hyperparameters used in the experiment:
Range — Enter a two-element vector that gives the lower bound and upper bound of a real- or integer-valued hyperparameter, or a string array or cell array that lists the possible values of a categorical hyperparameter.
Type — Select real (real-valued hyperparameter), integer (integer-valued hyperparameter), or categorical (categorical hyperparameter).
Transform — Select none (no transform) or log (logarithmic transform). For log, the hyperparameter must be real or integer and positive. The hyperparameter is searched and modeled on a logarithmic scale.
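For example, to search an initial learning rate between 1e-4 and 1e-1 on a logarithmic scale, you could set Range to [1e-4 1e-1], Type to real, and Transform to log.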
When you run the experiment, Experiment Manager searches for the best combination of hyperparameters. Each trial in the experiment uses a new combination of hyperparameter values based on the results of the previous trials. To specify the duration of your experiment, under Bayesian Optimization Options, enter the maximum time (in seconds) and the maximum number of trials to run. Bayesian optimization requires Statistics and Machine Learning Toolbox. For more information, see Tune Experiment Hyperparameters by Using Bayesian Optimization.
The Setup Function configures the training data, network architecture, and training options for the experiment. The input to the setup function is a struct with fields from the hyperparameter table. The output of the setup function must match the input of the trainNetwork function. This table lists the supported signatures for the setup function.
| Goal of Experiment | Setup Function Signature |
|---|---|
| Train a network for image classification problems using the image datastore imds to store the input image data. | function [imds,layers,options] = Experiment_setup(params) ... end |
| Train a network using the datastore ds. | function [ds,layers,options] = Experiment_setup(params) ... end |
| Train a network for image classification and regression problems using the numeric arrays X to store the predictor variables and Y to store the categorical labels or numeric responses. | function [X,Y,layers,options] = Experiment_setup(params) ... end |
| Train a network for sequence classification and regression problems using sequences to store sequence or time-series predictors and Y to store the responses. | function [sequences,Y,layers,options] = Experiment_setup(params) ... end |
| Train a network for classification and regression problems using the table tbl to store numeric data or file paths to the data. | function [tbl,layers,options] = Experiment_setup(params) ... end |
| Train a network for classification and regression problems using responseNames to specify the response variables in tbl. | function [tbl,responseNames,layers,options] = Experiment_setup(params) ... end |
Note
Experiment Manager does not support parallel execution when you set the training option 'ExecutionEnvironment' to 'multi-gpu' or 'parallel' or enable the training option 'DispatchInBackground'. For more information, see Use Experiment Manager to Train Networks in Parallel.
The Metrics section specifies functions to evaluate the results of the experiment. The input to a metric function is a struct with three fields:
trainedNetwork is the SeriesNetwork object or DAGNetwork object returned by the trainNetwork function. For more information, see Trained Network.
trainingInfo is a struct containing the training information returned by the trainNetwork function. For more information, see Training Information.
parameters is a struct with fields from the hyperparameter table.
The output of a metric function must be a scalar number, a logical value, or a string.
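For example, a custom metric function might extract a single value from the training information. This is a minimal sketch; the function name is hypothetical, and it assumes your setup function specifies validation data so that the ValidationLoss field is populated.

function metricOutput = FinalValidationLoss(trialInfo)
% trialInfo is the struct that Experiment Manager passes to metric
% functions, with fields trainedNetwork, trainingInfo, and parameters.
valLoss = trialInfo.trainingInfo.ValidationLoss;
valLoss = valLoss(~isnan(valLoss));   % keep only iterations with validation
metricOutput = valLoss(end);          % scalar output, as required
end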
If your experiment uses Bayesian optimization, select a metric to optimize from the Optimize list. In the Direction list, specify whether you want to Maximize or Minimize this metric. Experiment Manager uses this metric to determine the best combination of hyperparameters for your experiment. You can choose a standard training or validation metric (such as accuracy, RMSE, or loss) or a custom metric from the table.
This example shows how to compare the results of running an experiment.
When you run an experiment, Experiment Manager trains the network defined by the setup function multiple times. Each trial uses a different combination of hyperparameters. When the experiment finishes, a table displays training and validation metrics (such as accuracy, RMSE, and loss) for each trial. To compare the results of an experiment, you can use the training and validation metrics to sort the results table and filter trials.
To sort the trials in the results table, use the drop-down menu for the column corresponding to a training or validation metric.
Point to the header of a column by which you want to sort.
Click the triangle icon.
Select Sort in Ascending Order or Sort in Descending Order.
To filter trials from the results table, use the Filters pane.
On the Experiment Manager toolstrip, select Filters.
The Filters pane shows histograms for the numeric metrics in the results table. To remove a histogram from the Filters pane, in the results table, open the drop-down menu for the corresponding column and clear the Show Filter check box.
Adjust the sliders under the histogram for the training or validation metric by which you want to filter.
The results table shows only the trials with a metric value in the selected range.
To restore all of the trials in the results table, close the Experiment Result pane and reopen the results from the Experiment Browser pane.
This example shows how to inspect the configuration of an experiment that produced a given result.
After you run an experiment, you can open the Experiment Source pane to see a read-only copy of the experiment description and hyperparameter table, as well as links to the setup and metric functions called by the experiment. You can use the information in this pane to track the configuration of data, network, and training options that produce each of your results.
For instance, suppose that you run an experiment multiple times. Each time that you run the experiment, you change the contents of the setup function but always use the same name. The first time that you run the experiment, you use the default classification network provided by the setup function template. The second time that you run the experiment, you modify the setup function to load a pretrained GoogLeNet network, replacing the final layers with new layers for transfer learning. For an example that uses these two network architectures, see Create a Deep Learning Experiment for Classification.
On the first Experiment Result pane, click the View Experiment Source link. Experiment Manager opens an Experiment Source pane that contains the experiment definition that produced the first set of results. Click the link at the bottom of the pane to open the setup function that you used the first time you ran the experiment. You can copy this setup function to rerun the experiment using a simple classification network.
On the second Experiment Result pane, click the View Experiment Source link. Experiment Manager opens an Experiment Source pane that contains the experiment definition that produced the second set of results. Click the link at the bottom of the pane to open the setup function that you used the second time you ran the experiment. You can copy this setup function to rerun the experiment using transfer learning.
Experiment Manager stores a copy of the setup and custom metric functions that you use, so you do not have to manually rename these functions when you modify and rerun an experiment.
To visualize, build, and train a network without sweeping hyperparameters, try the Deep Network Designer app.