Example
Let’s consider a three-class classification problem with classes A, B, and C. Suppose we have a single data point whose true class label is A. The true label in one-hot encoded form is [1, 0, 0].
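As a quick illustration (a minimal sketch, assuming NumPy is available), the one-hot vector can be built from the class index:

```python
import numpy as np

classes = ["A", "B", "C"]
true_class = "A"

# Row of the identity matrix corresponding to the true class index
y_true = np.eye(len(classes))[classes.index(true_class)]
print(y_true)  # [1. 0. 0.]
```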
Assume the model predicts the following probabilities for this data point:
- Probability of class A: 0.7
- Probability of class B: 0.2
- Probability of class C: 0.1
The predicted probability vector is [0.7, 0.2, 0.1].
To calculate the cross entropy loss for this example, we use the formula:

$$L = -\sum_{i=1}^{C} y_i \log(p_i)$$

where $C$ is the number of classes, $y_i$ is the true (one-hot) label for class $i$, and $p_i$ is the predicted probability for class $i$.

Substituting the values:
- For class A: $y_A = 1$ and $\log(p_A) = \log(0.7) \approx -0.3567$
- For class B: $y_B = 0$ and $\log(p_B) = \log(0.2) \approx -1.6094$
- For class C: $y_C = 0$ and $\log(p_C) = \log(0.1) \approx -2.3026$

The cross entropy loss is calculated as:

$$L = -\left(1 \cdot \log(0.7) + 0 \cdot \log(0.2) + 0 \cdot \log(0.1)\right) = -\log(0.7) \approx 0.3567$$
So, the cross entropy loss for this example is approximately 0.3567. This value represents the penalty for the model’s predicted probabilities not perfectly matching the true class distribution. The lower the loss, the better the model’s predictions align with the true labels.
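As a quick check, a few lines of Python (assuming NumPy) reproduce this value:

```python
import numpy as np

# True one-hot label and predicted probabilities from the example above
y_true = np.array([1.0, 0.0, 0.0])
y_pred = np.array([0.7, 0.2, 0.1])

# Cross entropy: L = -sum(y_i * log(p_i))
loss = -np.sum(y_true * np.log(y_pred))
print(round(loss, 4))  # 0.3567
```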
Script Description:
- Cross Entropy Function: Computes the cross entropy loss given true labels and predicted probabilities.
- True and Predicted Probabilities Visualization: Bar plots display the true one-hot encoded labels and the predicted probability distribution.
- Cross Entropy Loss Calculation: Prints the loss value for a sample data point.
- Loss Curve: A line graph shows how the loss changes as the predicted probability for the true class increases.
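The script itself is not reproduced here; a minimal sketch covering the four components above, assuming NumPy and Matplotlib, might look like this:

```python
import numpy as np
import matplotlib.pyplot as plt

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Cross entropy loss for a single sample; eps guards against log(0)."""
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.sum(y_true * np.log(y_pred))

# Sample data point from the worked example
classes = ["A", "B", "C"]
y_true = np.array([1.0, 0.0, 0.0])
y_pred = np.array([0.7, 0.2, 0.1])

# Cross entropy loss calculation
loss = cross_entropy(y_true, y_pred)
print(f"Cross entropy loss: {loss:.4f}")  # ~0.3567

fig, axes = plt.subplots(1, 3, figsize=(15, 4))

# True and predicted probabilities visualization
axes[0].bar(classes, y_true, color="steelblue")
axes[0].set_title("True one-hot labels")
axes[0].set_ylabel("Probability")

axes[1].bar(classes, y_pred, color="darkorange")
axes[1].set_title("Predicted probabilities")
axes[1].set_ylabel("Probability")

# Loss curve: loss as the predicted probability for the true class increases
p_true = np.linspace(0.01, 1.0, 100)
axes[2].plot(p_true, -np.log(p_true))
axes[2].set_title("Loss vs. probability of true class")
axes[2].set_xlabel("Predicted probability for true class")
axes[2].set_ylabel("Cross entropy loss")

plt.tight_layout()
plt.show()
```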