
cross-entropy

Animated green-on-black visualization of cross-entropy as the average surprise of a model q on samples from a true distribution p. The left panel shows evolving class probabilities for p (true) and q (model). The right panel alternates between (1) the decomposition H(p,q)=H(p)+KL(p||q) with a stacked bar, and (2) the one-hot classification case where H(p,q) reduces to the negative log-likelihood -log q(y) of the correct class.
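The decomposition on the right panel can be checked numerically. A minimal sketch, assuming hypothetical 3-class distributions (the demo's actual values evolve over time and are not listed here); all quantities are in nats, matching the visualization:

```python
import math

# Hypothetical 3-class distributions (illustrative values, not from the demo)
p = [0.7, 0.2, 0.1]   # true distribution
q = [0.5, 0.3, 0.2]   # model distribution

H_p  = -sum(pi * math.log(pi) for pi in p)                  # entropy H(p)
H_pq = -sum(pi * math.log(qi) for pi, qi in zip(p, q))      # cross-entropy H(p,q)
kl   = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))  # KL(p||q)

# decomposition shown in the stacked bar: H(p,q) = H(p) + KL(p||q)
assert abs(H_pq - (H_p + kl)) < 1e-12
```

When p is one-hot at class y, every term of H(p,q) vanishes except the correct class, leaving -log q(y).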


practical uses

  • 01. Classification loss for neural networks (softmax + cross-entropy)
  • 02. Measuring how well a probabilistic model q matches observed data p
  • 03. Deriving and interpreting KL divergence as the gap between entropy and cross-entropy
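Use 01 is the one-hot case from the right panel: with a one-hot target, softmax + cross-entropy is just the negative log-likelihood of the correct class. A minimal Python sketch (the logits and the `cross_entropy_loss` helper are illustrative, not part of the demo):

```python
import math

def softmax(logits):
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def cross_entropy_loss(logits, y):
    """One-hot cross-entropy: -log q(y) for the correct class y."""
    return -math.log(softmax(logits)[y])

# Hypothetical 3-class logits; raising the correct class's logit lowers the loss
loss = cross_entropy_loss([2.0, 0.5, -1.0], y=0)
```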

technical notes

Uses deterministic, time-based morphing of 3-class distributions p and q (q lags behind p and wobbles, keeping KL(p||q) nonzero). Computes entropy, cross-entropy, and KL divergence in nats (natural log). Renders block-snapped bars and labels on a scanlined black background; cycles between the decomposition and one-hot views every ~3.8 s using a cubic ease() function.
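The lag-and-wobble idea can be sketched as follows. This is a guess at the shape of the technique, not the demo's actual code (the demo runs on a canvas, presumably in JavaScript): p(t) is a normalized vector of smooth periodic weights, and q(t) follows p at an earlier time plus a small perturbation, which keeps KL(p||q) strictly positive:

```python
import math

def morph_p(t):
    """Hypothetical deterministic 3-class p(t): positive periodic weights,
    clamped and normalized so they form a valid distribution."""
    w = [1.5 + math.sin(t),
         1.0 + math.sin(0.7 * t + 2.0),
         0.8 + math.sin(1.3 * t + 4.0)]
    w = [max(x, 0.05) for x in w]  # keep every class strictly positive
    s = sum(w)
    return [x / s for x in w]

def morph_q(t, lag=0.6, wobble=0.15):
    """q tracks p with a time lag plus a wobble, so it never matches exactly."""
    w = [b + wobble * math.sin(3.0 * t + i)
         for i, b in enumerate(morph_p(t - lag))]
    w = [max(x, 0.05) for x in w]
    s = sum(w)
    return [x / s for x in w]

def kl(p, q):
    """KL(p||q) in nats; zero iff p == q, by Gibbs' inequality."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

Because both morphs are pure functions of t, the animation is fully reproducible frame to frame.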