F1 Score Visualization

Exploring the Harmonic Mean of Precision & Recall

[Interactive chart: Precision and Recall curves plotted against the decision threshold (t), with a slider to adjust the threshold parameter and a live readout of the current F1 score.]

How This Visualization Works

The F1 score is a popular metric for classification tasks, especially on imbalanced datasets. It combines two other key metrics, Precision and Recall, into a single number. The slider represents the model's decision threshold: the score above which a prediction is labeled positive.
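
To make the calculation concrete, here is a minimal sketch in Python (not the code behind this page) of how Precision, Recall, and F1 can be computed for a given threshold t. The function name and the example data are purely illustrative:

    import numpy as np

    def precision_recall_f1(y_true, scores, t):
        """Precision, Recall, and F1 when scores >= t are labeled positive."""
        y_pred = (scores >= t).astype(int)
        tp = np.sum((y_pred == 1) & (y_true == 1))  # true positives
        fp = np.sum((y_pred == 1) & (y_true == 0))  # false positives
        fn = np.sum((y_pred == 0) & (y_true == 1))  # false negatives
        precision = tp / (tp + fp) if (tp + fp) > 0 else 0.0
        recall = tp / (tp + fn) if (tp + fn) > 0 else 0.0
        # F1 is the harmonic mean of Precision and Recall
        f1 = 2 * precision * recall / (precision + recall) if (precision + recall) > 0 else 0.0
        return precision, recall, f1

    # Illustrative data: the threshold t plays the same role as the slider above
    y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
    scores = np.array([0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2])
    p, r, f = precision_recall_f1(y_true, scores, t=0.5)
    print(f"precision={p:.2f} recall={r:.2f} f1={f:.2f}")  # precision=0.60 recall=0.75 f1=0.67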

As you move the slider, you trace a path along both curves. Notice their inverse relationship: lowering the threshold to increase Recall (capturing more true positives) typically sacrifices Precision (introducing more false positives), and raising it does the reverse. The F1 score, as the harmonic mean of the two, seeks a balance: it is highest when Precision and Recall are both high and close to each other.
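
To see why the harmonic mean rewards balance, here is a quick numerical check (the Precision/Recall pairs are chosen purely for illustration):

    def f1(precision, recall):
        # Harmonic mean of Precision and Recall
        if precision + recall == 0:
            return 0.0
        return 2 * precision * recall / (precision + recall)

    print(round(f1(0.5, 0.5), 2))  # 0.5  -- balanced pair
    print(round(f1(0.9, 0.1), 2))  # 0.18 -- same arithmetic mean, far lower F1

Both pairs have an arithmetic mean of 0.5, but the imbalanced pair's F1 collapses to 0.18: a model cannot earn a high F1 score by maximizing only one of the two metrics.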