getL2Factor

Get L2 regularization factor of layer learnable parameter

Description

factor = getL2Factor(layer,parameterName) returns the L2 regularization factor of the parameter with the name parameterName in layer.

For built-in layers, you can get the L2 regularization factor directly by using the corresponding property. For example, for a convolution2dLayer layer, the syntax factor = getL2Factor(layer,'Weights') is equivalent to factor = layer.WeightL2Factor.
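A minimal sketch of this equivalence for a convolution layer (both calls return the default factor of 1 for a freshly created layer):

```matlab
% For a built-in layer, getL2Factor and the corresponding
% property return the same value.
layer = convolution2dLayer(3,16);
factor1 = getL2Factor(layer,'Weights');
factor2 = layer.WeightL2Factor;
isequal(factor1,factor2)   % true; both are 1 by default
```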

factor = getL2Factor(layer,parameterPath) returns the L2 regularization factor of the parameter specified by the path parameterPath. Use this syntax when the parameter is in a dlnetwork object in a custom layer.

factor = getL2Factor(dlnet,layerName,parameterName) returns the L2 regularization factor of the parameter with the name parameterName in the layer with name layerName for the specified dlnetwork object.

factor = getL2Factor(dlnet,parameterPath) returns the L2 regularization factor of the parameter specified by the path parameterPath. Use this syntax when the parameter is in a nested layer.

Examples

Set and get the L2 regularization factor of a learnable parameter of a layer.

Define a custom PReLU layer. To create this layer, save the file preluLayer.m in the current folder.
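A minimal sketch of such a layer is shown below. This is an illustrative version only, assuming a single learnable parameter Alpha; the supporting file shipped with the documentation example may define additional methods and properties.

```matlab
% preluLayer.m - minimal custom PReLU layer with a learnable 'Alpha'
% parameter (illustrative sketch, not the shipped supporting file).
classdef preluLayer < nnet.layer.Layer
    properties (Learnable)
        Alpha   % learnable scaling applied to negative inputs
    end
    methods
        function layer = preluLayer(numChannels,name)
            % Construct the layer with one Alpha value per channel.
            layer.Name = name;
            layer.Description = "PReLU with " + numChannels + " channels";
            layer.Alpha = rand([1 1 numChannels]);
        end
        function Z = predict(layer,X)
            % PReLU: identity for positive inputs, Alpha-scaled
            % for negative inputs.
            Z = max(X,0) + layer.Alpha .* min(X,0);
        end
    end
end
```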

Create a layer array including a custom layer preluLayer.

layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    batchNormalizationLayer
    preluLayer(20,'prelu')
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

Set the L2 regularization factor of the 'Alpha' learnable parameter of the preluLayer to 2.

layers(4) = setL2Factor(layers(4),'Alpha',2);

View the updated L2 regularization factor.

factor = getL2Factor(layers(4),'Alpha')
factor = 2

Set and get the L2 regularization factor of a learnable parameter of a nested layer.

Create a residual block layer using the custom layer residualBlockLayer attached to this example as a supporting file. To access this file, open this example as a Live Script.

inputSize = [224 224 64];
numFilters = 64;
layer = residualBlockLayer(inputSize,numFilters)
layer = 
  residualBlockLayer with properties:

       Name: ''

   Learnable Parameters
    Network: [1x1 dlnetwork]

View the layers of the nested network.

layer.Network.Layers
ans = 
  8x1 Layer array with layers:

     1   'in'      Image Input           224x224x64 images
     2   'conv1'   Convolution           64 3x3x64 convolutions with stride [1  1] and padding 'same'
     3   'gn1'     Group Normalization   Group normalization with 64 channels split into 1 groups
     4   'relu1'   ReLU                  ReLU
     5   'conv2'   Convolution           64 3x3x64 convolutions with stride [1  1] and padding 'same'
     6   'gn2'     Group Normalization   Group normalization with 64 channels split into 64 groups
     7   'add'     Addition              Element-wise addition of 2 inputs
     8   'relu2'   ReLU                  ReLU

Set the L2 regularization factor of the learnable parameter 'Weights' of the layer 'conv1' to 2 using the setL2Factor function.

factor = 2;
layer = setL2Factor(layer,'Network/conv1/Weights',factor);

Get the updated L2 regularization factor using the getL2Factor function.

factor = getL2Factor(layer,'Network/conv1/Weights')
factor = 2

Set and get the L2 regularization factor of a learnable parameter of a dlnetwork object.

Create a dlnetwork object.

layers = [
    imageInputLayer([28 28 1],'Normalization','none','Name','in')
    convolution2dLayer(5,20,'Name','conv')
    batchNormalizationLayer('Name','bn')
    reluLayer('Name','relu')
    fullyConnectedLayer(10,'Name','fc')
    softmaxLayer('Name','sm')];

lgraph = layerGraph(layers);

dlnet = dlnetwork(lgraph);

Set the L2 regularization factor of the 'Weights' learnable parameter of the convolution layer to 2 using the setL2Factor function.

factor = 2;
dlnet = setL2Factor(dlnet,'conv','Weights',factor);

Get the updated L2 regularization factor using the getL2Factor function.

factor = getL2Factor(dlnet,'conv','Weights')
factor = 2

Set and get the L2 regularization factor of a learnable parameter of a nested layer in a dlnetwork object.

Create a dlnetwork object containing the custom layer residualBlockLayer attached to this example as a supporting file. To access this file, open this example as a Live Script.

inputSize = [224 224 3];
numFilters = 32;
numClasses = 5;

layers = [
    imageInputLayer(inputSize,'Normalization','none','Name','in')
    convolution2dLayer(7,numFilters,'Stride',2,'Padding','same','Name','conv')
    groupNormalizationLayer('all-channels','Name','gn')
    reluLayer('Name','relu')
    maxPooling2dLayer(3,'Stride',2,'Name','max')
    residualBlockLayer([56 56 numFilters],numFilters,'Name','res1')
    residualBlockLayer([56 56 numFilters],numFilters,'Name','res2')
    residualBlockLayer([56 56 numFilters],2*numFilters,'Stride',2,'IncludeSkipConvolution',true,'Name','res3')
    residualBlockLayer([28 28 2*numFilters],2*numFilters,'Name','res4')
    residualBlockLayer([28 28 2*numFilters],4*numFilters,'Stride',2,'IncludeSkipConvolution',true,'Name','res5')
    residualBlockLayer([14 14 4*numFilters],4*numFilters,'Name','res6')
    globalAveragePooling2dLayer('Name','gap')
    fullyConnectedLayer(numClasses,'Name','fc')
    softmaxLayer('Name','sm')];

lgraph = layerGraph(layers);
dlnet = dlnetwork(lgraph);

The Learnables property of the dlnetwork object is a table that contains the learnable parameters of the network. The table includes parameters of nested layers in separate rows. View the learnable parameters of the layer "res1".

learnables = dlnet.Learnables;
idx = learnables.Layer == "res1";
learnables(idx,:)
ans=8×3 table
    Layer            Parameter                  Value       
    ______    _______________________    ___________________

    "res1"    "Network/conv1/Weights"    {3x3x32x32 dlarray}
    "res1"    "Network/conv1/Bias"       {1x1x32    dlarray}
    "res1"    "Network/gn1/Offset"       {1x1x32    dlarray}
    "res1"    "Network/gn1/Scale"        {1x1x32    dlarray}
    "res1"    "Network/conv2/Weights"    {3x3x32x32 dlarray}
    "res1"    "Network/conv2/Bias"       {1x1x32    dlarray}
    "res1"    "Network/gn2/Offset"       {1x1x32    dlarray}
    "res1"    "Network/gn2/Scale"        {1x1x32    dlarray}

For the layer "res1", set the L2 regularization factor of the learnable parameter 'Weights' of the layer 'conv1' to 2 using the setL2Factor function.

factor = 2;
dlnet = setL2Factor(dlnet,'res1/Network/conv1/Weights',factor);

Get the updated L2 regularization factor using the getL2Factor function.

factor = getL2Factor(dlnet,'res1/Network/conv1/Weights')
factor = 2

Input Arguments

layer

Input layer, specified as a scalar Layer object.

parameterName

Parameter name, specified as a character vector or a string scalar.

parameterPath

Path to parameter in nested layer, specified as a string scalar or a character vector. A nested layer is a custom layer that itself defines a layer graph as a learnable parameter.

If the input to getL2Factor is a nested layer, then the parameter path has the form "propertyName/layerName/parameterName", where:

  • propertyName is the name of the property containing a dlnetwork object

  • layerName is the name of the layer in the dlnetwork object

  • parameterName is the name of the parameter

If there are multiple levels of nested layers, then specify each level using the form "propertyName1/layerName1/.../propertyNameN/layerNameN/parameterName", where propertyName1 and layerName1 correspond to the layer in the input to the getL2Factor function, and the subsequent parts correspond to the deeper levels.

Example: For layer input to getL2Factor, the path "Network/conv1/Weights" specifies the "Weights" parameter of the layer with name "conv1" in the dlnetwork object given by layer.Network.

If the input to getL2Factor is a dlnetwork object and the desired parameter is in a nested layer, then the parameter path has the form "layerName1/propertyName/layerName/parameterName", where:

  • layerName1 is the name of the layer in the input dlnetwork object

  • propertyName is the property of the layer containing a dlnetwork object

  • layerName is the name of the layer in the dlnetwork object

  • parameterName is the name of the parameter

If there are multiple levels of nested layers, then specify each level using the form "layerName1/propertyName1/.../layerNameN/propertyNameN/layerName/parameterName", where layerName1 and propertyName1 correspond to the layer in the input to the getL2Factor function, and the subsequent parts correspond to the deeper levels.

Example: For dlnetwork input to getL2Factor, the path "res1/Network/conv1/Weights" specifies the "Weights" parameter of the layer with name "conv1" in the dlnetwork object given by layer.Network, where layer is the layer with name "res1" in the input network dlnet.

Data Types: char | string

dlnet

Network for custom training loops, specified as a dlnetwork object.

layerName

Layer name, specified as a string scalar or a character vector.

Data Types: char | string

Output Arguments

factor

L2 regularization factor for the parameter, returned as a nonnegative scalar.

The software multiplies this factor by the global L2 regularization factor to determine the L2 regularization for the specified parameter. For example, if factor is 2, then the L2 regularization for the specified parameter is twice the current global L2 regularization. The software determines the global L2 regularization based on the settings specified with the trainingOptions function.
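The relationship above can be sketched as follows, using the L2Regularization training option as the global factor:

```matlab
% Sketch: the effective L2 regularization for a parameter is the
% per-parameter factor times the global factor from trainingOptions.
opts = trainingOptions('sgdm','L2Regularization',0.0001);  % global factor

layer = convolution2dLayer(3,16);
layer = setL2Factor(layer,'Weights',2);        % per-parameter factor

effectiveL2 = getL2Factor(layer,'Weights') * opts.L2Regularization
% effectiveL2 is 2*1e-4 = 2e-04
```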

Introduced in R2017b