Activation function
An activation function is applied in a perceptron after the weighted sum of the inputs has been computed; it provides the non-linear term of the model. Classic activation functions are
- Identity function,
- Binary step,
- Sigmoid function,
- Rectified linear unit (ReLU), or
- Hyperbolic tangent (tanh).
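As a sketch, the functions listed above can be implemented with NumPy (the function names here are my own, not a standard API):

```python
import numpy as np

def identity(x):
    # Passes the weighted sum through unchanged (linear).
    return x

def binary_step(x):
    # Outputs 1 where the input is non-negative, 0 otherwise.
    return np.where(x >= 0, 1.0, 0.0)

def sigmoid(x):
    # Squashes the input into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified linear unit: zero for negative inputs, identity otherwise.
    return np.maximum(0.0, x)

def tanh(x):
    # Squashes the input into the range (-1, 1).
    return np.tanh(x)

# Apply an activation to an example weighted sum z = w · x + b
z = np.array([-2.0, 0.0, 3.0])
print(relu(z))  # [0. 0. 3.]
```

Each function is applied element-wise, so the same code works for a single perceptron or a whole layer of units at once.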