leakyReluLayer

Leaky Rectified Linear Unit (ReLU) layer

Description

A leaky ReLU layer performs a threshold operation, where any input value less than zero is multiplied by a fixed scalar [1].

This operation is equivalent to:

f(x) = { x,          x ≥ 0
       { scale * x,  x < 0
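For example, the operation can be reproduced elementwise in MATLAB. A minimal sketch (illustrative only, not the layer itself):

x = [-2 -0.5 0 1 3];
scale = 0.01;                   % default scale used by leakyReluLayer
y = max(x,0) + scale*min(x,0)   % y = [-0.02 -0.005 0 1 3]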

Creation

Description

layer = leakyReluLayer returns a leaky ReLU layer.

layer = leakyReluLayer(scale) returns a leaky ReLU layer with a scalar multiplier for negative inputs equal to scale.


layer = leakyReluLayer(___,'Name',Name) returns a leaky ReLU layer and sets the optional Name property.
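For instance (the scale 0.2 and the name 'lrelu' are arbitrary illustrative values):

layer1 = leakyReluLayer;                       % default scale
layer2 = leakyReluLayer(0.2);                  % negative inputs scaled by 0.2
layer3 = leakyReluLayer(0.2,'Name','lrelu');   % scaled and named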

Properties


Leaky ReLU

Scale - Scalar multiplier for negative input values
0.01 (default) | numeric scalar

Scalar multiplier for negative input values, specified as a numeric scalar.

Example: 0.4
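For example, the scale can be set at creation and read back from the layer object:

layer = leakyReluLayer(0.4);
layer.Scale   % returns 0.4000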

Layer

Name - Layer name, specified as a character vector or a string scalar. To include a layer in a layer graph, you must specify a nonempty, unique layer name. If you train a series network with the layer and Name is set to '', then the software automatically assigns a name to the layer at training time.

Data Types: char | string

NumInputs - Number of inputs of the layer. This layer accepts a single input only.

Data Types: double

InputNames - Input names of the layer. This layer accepts a single input only.

Data Types: cell

NumOutputs - Number of outputs of the layer. This layer has a single output only.

Data Types: double

OutputNames - Output names of the layer. This layer has a single output only.

Data Types: cell
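These input and output properties are read-only. They can be inspected on a layer object, for example (the values shown in comments are assumptions about what the software returns and may vary by release):

layer = leakyReluLayer;
layer.NumInputs    % 1
layer.InputNames   % e.g., {'in'}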

Examples


Create Leaky ReLU Layer

Create a leaky ReLU layer with the name 'leaky1' and a scalar multiplier for negative inputs equal to 0.1.

layer = leakyReluLayer(0.1,'Name','leaky1')
layer = 
  LeakyReLULayer with properties:

     Name: 'leaky1'

   Hyperparameters
    Scale: 0.1000

Include Leaky ReLU Layer in Layer Array

Include a leaky ReLU layer in a Layer array.

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16)
    batchNormalizationLayer
    leakyReluLayer
    
    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,32)
    batchNormalizationLayer
    leakyReluLayer
    
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer]
layers = 
  11x1 Layer array with layers:

     1   ''   Image Input             28x28x1 images with 'zerocenter' normalization
     2   ''   Convolution             16 3x3 convolutions with stride [1  1] and padding [0  0  0  0]
     3   ''   Batch Normalization     Batch normalization
     4   ''   Leaky ReLU              Leaky ReLU with scale 0.01
     5   ''   Max Pooling             2x2 max pooling with stride [2  2] and padding [0  0  0  0]
     6   ''   Convolution             32 3x3 convolutions with stride [1  1] and padding [0  0  0  0]
     7   ''   Batch Normalization     Batch normalization
     8   ''   Leaky ReLU              Leaky ReLU with scale 0.01
     9   ''   Fully Connected         10 fully connected layer
    10   ''   Softmax                 softmax
    11   ''   Classification Output   crossentropyex
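The resulting array can be passed directly to trainNetwork. A minimal training sketch, assuming the digit sample data set that ships with Deep Learning Toolbox:

[XTrain,YTrain] = digitTrain4DArrayData;   % sample 28x28 grayscale digit images
options = trainingOptions('sgdm', ...
    'MaxEpochs',4, ...
    'Verbose',false);
net = trainNetwork(XTrain,YTrain,layers,options);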

References

[1] Maas, Andrew L., Awni Y. Hannun, and Andrew Y. Ng. "Rectifier nonlinearities improve neural network acoustic models." In Proc. ICML, vol. 30, no. 1. 2013.

Extended Capabilities

GPU Code Generation
Generate CUDA® code for NVIDIA® GPUs using GPU Coder™.

Introduced in R2017b