Apply rectified linear unit activation
The rectified linear unit (ReLU) activation operation performs a nonlinear threshold operation, where any input value less than zero is set to zero.
This operation is equivalent to

    f(x) = x,  if x > 0
    f(x) = 0,  if x ≤ 0
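For example, a minimal sketch of applying the operation to unformatted dlarray data (the input values are illustrative assumptions):

    X = dlarray([-2 -1 0 1 2]);   % wrap sample values in a dlarray
    Y = relu(X);                  % values less than zero are set to zero
    extractdata(Y)                % returns 0 0 0 1 2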
Note
This function applies the ReLU operation to dlarray data. If you want to apply ReLU activation within a layerGraph object or Layer array, use the following layer:

reluLayer
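For example, a minimal sketch of a Layer array that applies ReLU activation with reluLayer (the surrounding layers and their sizes are illustrative assumptions):

    layers = [
        featureInputLayer(10)     % hypothetical 10-feature input
        fullyConnectedLayer(20)   % hypothetical hidden layer
        reluLayer                 % applies ReLU activation
        fullyConnectedLayer(1)];  % hypothetical output layer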