Leaky Rectified Linear Unit (ReLU) layer
A leaky ReLU layer performs a threshold operation, where any input value less than zero is multiplied by a fixed scalar.
This operation is equivalent to:

    f(x) = x,          x >= 0
    f(x) = scale * x,  x < 0

where scale is the fixed scalar applied to negative input values.
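A minimal sketch of this operation in NumPy is shown below; the function name leaky_relu and the default scale of 0.01 are illustrative choices, not part of the layer's API.

    import numpy as np

    def leaky_relu(x, scale=0.01):
        # Elementwise leaky ReLU: non-negative values pass through
        # unchanged; negative values are multiplied by the fixed
        # scalar `scale` (0.01 here is an illustrative default).
        x = np.asarray(x, dtype=float)
        return np.where(x >= 0, x, scale * x)

    # Example: negative entries are scaled, non-negative entries are unchanged.
    print(leaky_relu([-2.0, -0.5, 0.0, 1.5]))
    # [-0.02  -0.005  0.     1.5  ]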