
softmax-and-logits

Visualizes logits as raw class scores, then shows how softmax exponentiates and normalizes them into probabilities. The animation cycles through adding a constant shift (demonstrating shift invariance), a naive exponentiation step that can overflow, and the numerically stable log-sum-exp trick (subtracting the max logit), with the resulting probabilities unchanged throughout.
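The three behaviors the animation cycles through can be reproduced in a few lines of NumPy (a minimal sketch; the function names are illustrative, not the page's own code):

```python
import numpy as np

def softmax_naive(z):
    """Exponentiate and normalize directly -- overflows for large logits."""
    e = np.exp(z)
    return e / e.sum()

def softmax_stable(z):
    """Subtract the max logit first (log-sum-exp trick); identical result
    in exact arithmetic, but safe in floating point."""
    e = np.exp(z - z.max())
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])

# Shift invariance: adding a constant to every logit leaves softmax unchanged.
assert np.allclose(softmax_naive(logits), softmax_naive(logits + 5.0))

# The naive version overflows once logits are large...
big = np.array([1000.0, 1001.0, 1002.0])
print(softmax_naive(big))   # [nan nan nan] -- exp(1000) overflows to inf
# ...while the stable version is fine.
print(softmax_stable(big))  # ~[0.090 0.245 0.665]
```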


practical uses

  • 01.Interpreting classifier outputs: logits vs probabilities (confidence)
  • 02.Computing attention weights in transformers via softmax over similarity scores
  • 03.Implementing stable cross-entropy / softmax for training without numerical overflow
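Uses 02 and 03 both reduce to a stable log-softmax. A minimal sketch (helper names are hypothetical, and the attention example uses a single query over four keys for illustration):

```python
import numpy as np

def log_softmax(z):
    """Stable log-softmax: log p_i = z_i - m - log(sum_j exp(z_j - m)), m = max(z)."""
    m = z.max(axis=-1, keepdims=True)
    return z - m - np.log(np.exp(z - m).sum(axis=-1, keepdims=True))

def cross_entropy(logits, target):
    """Negative log-probability of the target class, computed from logits
    without ever materializing a bare exp(z) that could overflow."""
    return -log_softmax(logits)[target]

# Attention weights: softmax over query-key similarity scores.
scores = np.array([3.2, 0.5, -1.0, 2.1])   # similarity of one query to 4 keys
weights = np.exp(log_softmax(scores))      # attention weights, sum to 1

# Stable cross-entropy stays finite even for extreme logits.
loss = cross_entropy(np.array([1000.0, 0.0]), target=0)
```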

technical notes

Three-column blocky panel: logits -> exp & sum -> probabilities. The animation cycles every ~3.6s through 4 conceptual steps; a global shift c(t) demonstrates softmax's shift invariance, and step 4 computes the stable softmax by subtracting the max logit. Bars are grid-snapped for a retro aesthetic; exp bars use log scaling to stay drawable.
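The per-step values the panel could display can be sketched as follows (a hypothetical reconstruction of the cycle's arithmetic; `frame_values` and the step indexing are illustrative, not the page's actual rendering code):

```python
import numpy as np

logits = np.array([2.0, -1.0, 0.5])

def frame_values(step, c):
    """Values for one animation step: 0 = raw logits, 1 = shifted by c(t),
    2 = naive exponentiation, 3 = stable path (subtract max)."""
    z = logits + (c if step == 1 else 0.0)
    if step == 3:
        e = np.exp(z - z.max())   # stable: exponents are all <= 0
    else:
        e = np.exp(z)             # naive: can overflow for large z
    return z, e, e / e.sum()

# The probabilities column always sums to 1, and the shifted frame (step 1)
# produces the same probabilities as the raw frame (step 0).
for step in range(4):
    z, e, p = frame_values(step, c=5.0)
    assert abs(p.sum() - 1.0) < 1e-9
assert np.allclose(frame_values(0, 5.0)[2], frame_values(1, 5.0)[2])
```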