Visualizes logits as raw class scores, then shows how softmax exponentiates and normalizes them into probabilities. The animation cycles through adding a constant shift to all logits (demonstrating shift invariance), a naive exponentiation step that can overflow for large logits, and the numerically stable trick of subtracting the max logit before exponentiating, which leaves the resulting probabilities unchanged.
Three-column blocky panel: logits -> exp & sum -> probabilities. The animation cycles every ~3.6s through 4 conceptual steps; a global shift c(t) demonstrates softmax's invariance to constant offsets, and step 4 computes the stable softmax by subtracting the max logit. Bars are grid-snapped for a retro aesthetic; exp bars use log scaling to stay drawable.
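The math the panel animates can be sketched in a few lines of NumPy. This is a minimal illustration, not the demo's actual rendering code; the function names are mine. It shows the naive softmax overflowing on large logits, the max-subtraction fix, and the shift invariance that the constant offset c(t) demonstrates.

```python
import numpy as np

def softmax_naive(logits):
    """Direct exponentiation: exp() overflows to inf for large logits."""
    e = np.exp(logits)
    return e / e.sum()

def softmax_stable(logits):
    """Subtract the max logit first; every exp argument is then <= 0,
    so no overflow, and the result is mathematically identical."""
    e = np.exp(logits - np.max(logits))
    return e / e.sum()

logits = np.array([1000.0, 1001.0, 1002.0])

# Naive version: exp(1000) overflows, yielding inf/inf = nan.
with np.errstate(over="ignore", invalid="ignore"):
    naive = softmax_naive(logits)
print(np.isnan(naive).any())        # overflow produced nans

# Stable version: finite, well-defined probabilities that sum to 1.
stable = softmax_stable(logits)
print(stable, stable.sum())

# Shift invariance: adding any constant c to all logits cancels
# out in the normalization, leaving the probabilities unchanged.
small = np.array([1.0, 2.0, 3.0])
print(np.allclose(softmax_stable(small), softmax_stable(small + 500.0)))
```

Subtracting the max rather than an arbitrary constant guarantees the largest exponent is exactly 0, so at least one term in the sum is 1.0 and the denominator can never underflow to zero.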