Rectified linear unit (ReLU)
The rectified linear unit (ReLU) is an activation function that cuts the lower end off the identity function at zero: it returns the input unchanged when the input is positive and returns zero otherwise, i.e. f(x) = max(0, x).
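The definition above can be sketched in a few lines of Python (a minimal NumPy sketch; the function name `relu` is illustrative, not a library API):

```python
import numpy as np

def relu(x):
    """Rectified linear unit: max(0, x), applied elementwise."""
    return np.maximum(0, x)

# Negative inputs are clamped to zero; positive inputs pass through.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5])))
```

`np.maximum` broadcasts over arrays, so the same function handles scalars, vectors, or batches of activations.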