Apply leaky rectified linear unit activation
The leaky rectified linear unit (ReLU) activation operation performs a nonlinear threshold operation, where any input value less than zero is multiplied by a fixed scale factor.
This operation is equivalent to

$$f(x) = \begin{cases} x, & x \ge 0 \\ \mathit{scale} \cdot x, & x < 0 \end{cases}$$
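A minimal usage sketch, assuming formatted dlarray input; the input sizes, data format, and the 0.05 custom scale factor are illustrative (the default scale factor is 0.01):

% Create random image data as a formatted dlarray
% ("SSCB": spatial, spatial, channel, batch).
X = dlarray(randn(28,28,3,16,"single"),"SSCB");

% Apply leaky ReLU with the default scale factor of 0.01.
Y = leakyrelu(X);

% Apply leaky ReLU with a custom scale factor of 0.05 for negative values.
Y = leakyrelu(X,0.05);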
Note

This function applies the leaky ReLU operation to dlarray data. If you want to apply leaky ReLU activation within a layerGraph object or Layer array, use the following layer:

leakyReluLayer
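For comparison, a minimal sketch of the layer-based workflow using leakyReluLayer in a Layer array; the surrounding layers and sizes are illustrative assumptions, not part of this reference page:

% Define a simple Layer array that uses a leaky ReLU layer
% instead of calling the leakyrelu function directly.
layers = [
    imageInputLayer([28 28 3])
    convolution2dLayer(3,16,"Padding","same")
    leakyReluLayer(0.05)        % leaky ReLU layer with scale factor 0.05
    fullyConnectedLayer(10)
    softmaxLayer];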