Log Loss (Cross-Entropy) — penalizes incorrect probabilities
[Interactive plot: Log Loss vs. predicted probability p. Curves: −log(p) when y = 1 and −log(1−p) when y = 0. Controls for p, the true label y, and animation speed; example shown: p = 0.50, y = 1, Log Loss = 0.69.]
Log Loss evaluates probabilistic predictions. For a single sample with true label y ∈ {0, 1} and predicted probability p, the loss is −( y·ln p + (1−y)·ln(1−p) ), averaged over all samples. Confident mistakes (e.g., p ≈ 0 when y = 1) incur a very large penalty.
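A minimal NumPy sketch of this formula (the function name `log_loss` and the `eps` clipping constant are illustrative choices, not part of the page):

```python
import numpy as np

def log_loss(y_true, p_pred, eps=1e-15):
    """Mean binary cross-entropy: -(y*ln(p) + (1-y)*ln(1-p)), averaged over samples.

    Probabilities are clipped to [eps, 1-eps] so a confident mistake
    (p ~ 0 while y = 1) gives a large but finite penalty.
    """
    y = np.asarray(y_true, dtype=float)
    p = np.clip(np.asarray(p_pred, dtype=float), eps, 1.0 - eps)
    return float(np.mean(-(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))))

# The widget's example: p = 0.50 with y = 1 gives -ln(0.5) ≈ 0.693.
print(log_loss([1], [0.50]))   # ≈ 0.6931
# A confident mistake is punished much harder:
print(log_loss([1], [0.01]))   # ≈ 4.6052
# A confident correct prediction costs almost nothing:
print(log_loss([1], [0.99]))   # ≈ 0.0101
```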
Drag p or press Play. When y = 1, the active curve is −log(p); when y = 0, the active curve is −log(1−p). Lower is better.