Exponential linear unit (ELU) layer
An ELU activation layer performs the identity operation on positive inputs and an exponential nonlinearity on negative inputs.
The layer performs the following operation [1]:

f(x) = x                 if x ≥ 0
f(x) = α(exp(x) − 1)     if x < 0

The default value of α is 1. Specify a value of α for the layer by setting the Alpha property.
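As an informal illustration of this operation (not the layer implementation itself), the following MATLAB sketch evaluates the ELU nonlinearity elementwise; the function name eluValue and the variable names are hypothetical:

function y = eluValue(x, alpha)
    % Evaluate the ELU nonlinearity elementwise:
    % identity for nonnegative inputs, alpha*(exp(x) - 1) for negative inputs.
    if nargin < 2
        alpha = 1;   % default value of alpha, matching the layer default
    end
    y = x;
    neg = x < 0;
    y(neg) = alpha * (exp(x(neg)) - 1);
end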
layer = eluLayer creates an ELU layer.

layer = eluLayer(alpha) creates an ELU layer and specifies the Alpha property.
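For example, a minimal sketch of using an ELU layer (here with Alpha set to 2) inside a simple layer array might look like the following; the layer sizes and the layer name 'elu1' are arbitrary illustrative choices:

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    eluLayer(2, 'Name', 'elu1')    % ELU layer with Alpha = 2
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];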
[1] Clevert, Djork-Arné, Thomas Unterthiner, and Sepp Hochreiter. "Fast and accurate deep network learning by exponential linear units (ELUs)." arXiv preprint arXiv:1511.07289 (2015).
batchNormalizationLayer | clippedReluLayer | leakyReluLayer | reluLayer | trainNetwork