Used in neural networks to introduce non-linearity. Common choices are listed below; a short code sketch of all four follows the list.
- Sigmoid: S-shaped curve mapping inputs to (0, 1); often used for binary classification. See https://en.wikipedia.org/wiki/Sigmoid_function
- ReLU (Rectified Linear Unit): outputs the input directly if positive, otherwise 0. Simple and efficient, often used in modern deep neural networks. See https://en.wikipedia.org/wiki/Rectifier_(neural_networks)
- Tanh: S-shaped curve with range (-1, 1); similar to sigmoid but zero-centered.
- Softmax: used for multi-class classification; normalizes a vector of scores so the output probabilities sum to 1.
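A minimal NumPy sketch of all four activations (this assumes NumPy is available; the function names are illustrative, not taken from any particular library):

```python
import numpy as np

def sigmoid(x):
    # Maps any real input into (0, 1); common for binary-classification outputs.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Passes positive inputs through unchanged, clamps negatives to 0.
    return np.maximum(0.0, x)

def tanh(x):
    # S-shaped like sigmoid but zero-centered, with range (-1, 1).
    return np.tanh(x)

def softmax(x):
    # Subtracting the max before exponentiating is a standard numerical-stability
    # trick; the result is unchanged and the outputs sum to 1 along the last axis.
    shifted = x - np.max(x, axis=-1, keepdims=True)
    exps = np.exp(shifted)
    return exps / np.sum(exps, axis=-1, keepdims=True)

if __name__ == "__main__":
    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print("sigmoid:", sigmoid(x))
    print("relu:   ", relu(x))
    print("tanh:   ", tanh(x))
    print("softmax:", softmax(x))  # non-negative, sums to 1
```

Deep-learning frameworks ship these as built-ins (e.g. torch.sigmoid, torch.relu in PyTorch), but the definitions above are the underlying math.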