Log Loss (Cross-Entropy) — penalizes incorrect probabilities
[Interactive controls: predicted probability p (slider), true label y (toggle), animation Play/Speed. Example readout: p = 0.50, y = 1, Log Loss ≈ 0.69.]
Log Loss evaluates probabilistic predictions:
−( y·ln p + (1−y)·ln(1−p) ) (averaged over samples).
Confident mistakes (e.g., p≈0 but y=1) incur a very large penalty.
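A minimal sketch of the same computation, assuming NumPy; the function name log_loss and the clipping value eps are illustrative choices, with the clip guarding against ln(0) at p = 0 or p = 1.

```python
import numpy as np

def log_loss(y_true, p_pred, eps=1e-15):
    """Average binary cross-entropy: -(y*ln(p) + (1-y)*ln(1-p)), clipped to avoid log(0)."""
    p = np.clip(np.asarray(p_pred, dtype=float), eps, 1 - eps)
    y = np.asarray(y_true, dtype=float)
    return float(np.mean(-(y * np.log(p) + (1 - y) * np.log(1 - p))))

# p = 0.5 with y = 1 gives -ln(0.5) ≈ 0.693, matching the readout above.
print(log_loss([1], [0.5]))          # ~0.693
print(log_loss([1, 0], [0.9, 0.1]))  # confident and correct -> small loss (~0.105)
print(log_loss([1], [0.01]))         # confident mistake -> large penalty (~4.6)
```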
Drag p or press Play. When y = 1, the active curve is −ln(p); when y = 0, it is −ln(1−p). Lower is better.