leakyrelu

Apply leaky rectified linear unit activation

Description

The leaky rectified linear unit (ReLU) activation operation performs a nonlinear threshold operation, where any input value less than zero is multiplied by a fixed scale factor.

This operation is equivalent to

f(x) = x          if x >= 0
f(x) = scale * x  if x < 0
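As a quick, informal check (added here, not part of the original reference page), you can compare leakyrelu against this piecewise formula computed elementwise. The snippet assumes Deep Learning Toolbox is available and uses an arbitrary scale factor of 0.2.

scale = 0.2;                      % arbitrary scale factor for this check
X = randn(4,4);
dlX = dlarray(X);

dlY = leakyrelu(dlX,scale);

% Piecewise formula: x where x >= 0, scale*x where x < 0
Yref = X.*(X >= 0) + scale*X.*(X < 0);

max(abs(extractdata(dlY) - Yref),[],'all')   % expect a value of (or very near) 0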

Note

This function applies the leaky ReLU operation to dlarray data. If you want to apply leaky ReLU activation within a layerGraph object or Layer array, use a leakyReluLayer.


dlY = leakyrelu(dlX) computes the leaky ReLU activation of the input dlX by applying a threshold operation. All values in dlX less than zero are multiplied by a default scale factor of 0.01.

dlY = leakyrelu(dlX,scaleFactor) specifies the scale factor for the leaky ReLU operation.
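For reference, here is a minimal sketch (added, not from the original page) showing both call forms on the same unformatted dlarray:

dlX = dlarray(randn(5,5));
dlY1 = leakyrelu(dlX);        % default scale factor of 0.01
dlY2 = leakyrelu(dlX,0.2);    % explicit scale factor of 0.2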

Examples


Use the leakyrelu function to scale negative values in the input data.

Create the input data as a single observation of random values with a height and width of 12 and with 32 channels.

height = 12;
width = 12;
channels = 32;
observations = 1;

X = randn(height,width,channels,observations);
dlX = dlarray(X,'SSCB');

Compute the leaky ReLU activation using a scale factor of 0.05 for the negative values in the input.

dlY = leakyrelu(dlX,0.05);
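As an added check (not part of the original example), you can confirm that the negative entries of the input were multiplied by 0.05:

Y = extractdata(dlY);                 % numeric data of the activations
negIdx = X < 0;                       % locations of negative input values
max(abs(Y(negIdx) - 0.05*X(negIdx)))  % expect a value of (or very near) 0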

Input Arguments


dlX - Input data
dlarray object

Input data, specified as a dlarray with or without dimension labels.

Data Types: single | double

scaleFactor - Scale factor for negative inputs
0.01 (default) | numeric scalar

Scale factor for negative inputs, specified as a numeric scalar. The default value is 0.01.

Data Types: single | double

Output Arguments


dlY - Leaky ReLU activations
dlarray

Leaky ReLU activations, returned as a dlarray. The output dlY has the same underlying data type as the input dlX.

If the input data dlX is a formatted dlarray, dlY has the same dimension labels as dlX. If the input data is not a formatted dlarray, dlY is an unformatted dlarray with the same dimension order as the input data.
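For example, a brief illustration (added, not from the original page) of the label behavior with a formatted 'SSCB' input:

dlX = dlarray(randn(8,8,3,2),'SSCB');   % formatted input
dlY = leakyrelu(dlX);
dims(dlY)                               % returns 'SSCB', the same as dims(dlX)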


Introduced in R2019b