Animated green-on-black visualization of cross-entropy as the average surprise of a model q on samples from a true distribution p. The left panel shows evolving class probabilities for p (true) and q (model). The right panel alternates between (1) the decomposition H(p,q)=H(p)+KL(p||q) with a stacked bar, and (2) the one-hot classification case where H(p,q) reduces to the negative log-likelihood -log q(y) of the correct class.
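The quantities named above can be checked numerically; this is a minimal sketch (the distributions `p` and `q` are illustrative placeholders, not values from the visualization), confirming both the decomposition H(p,q)=H(p)+KL(p||q) and the one-hot reduction to -log q(y), all in nats:

```python
import math

def entropy(p):
    # H(p) = -sum p_i * log(p_i), in nats; zero-probability terms contribute 0
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    # H(p, q) = -sum p_i * log(q_i): average surprise of model q on samples from p
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl(p, q):
    # KL(p||q) = sum p_i * log(p_i / q_i), computed directly (not via the identity)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative 3-class distributions (assumed for this example)
p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]

# Decomposition: H(p, q) = H(p) + KL(p||q)
assert abs(cross_entropy(p, q) - (entropy(p) + kl(p, q))) < 1e-12

# One-hot case: p puts all mass on class 1, so H(p) = 0 and
# H(p, q) collapses to the negative log-likelihood -log q(y)
p_onehot = [0.0, 1.0, 0.0]
assert abs(cross_entropy(p_onehot, q) - (-math.log(q[1]))) < 1e-12
```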
Uses deterministic, time-based morphing of 3-class distributions p and q (q lags and wobbles behind p, keeping KL(p||q) nonzero). Computes entropy, cross-entropy, and KL divergence in nats (natural log). Renders block-snapped bars and labels on a scanlined black background, cycling between the decomposition and one-hot views roughly every 3.8 s via the provided cubic ease() function.
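The morphing and easing can be sketched as follows. This is an assumed reconstruction, not the visualization's actual code: the sine frequencies, lag, and wobble amplitude are made-up parameters, and `ease` is one common cubic ease-in-out form (the provided ease() may differ):

```python
import math

def ease(t):
    # Cubic ease-in-out on [0, 1]; a common form, assumed here for illustration
    return 4 * t * t * t if t < 0.5 else 1 - ((-2 * t + 2) ** 3) / 2

def morph_p(t):
    # Deterministic time-based logits for 3 classes (illustrative frequencies),
    # normalized via softmax so the result is a valid distribution
    logits = [math.sin(0.7 * t), math.sin(0.9 * t + 1.0), math.sin(1.3 * t + 2.0)]
    z = [math.exp(l) for l in logits]
    s = sum(z)
    return [v / s for v in z]

def morph_q(t, lag=0.6, wobble=0.15):
    # q tracks p's logits with a time lag plus a small wobble term,
    # so q never quite matches p and KL(p||q) stays nonzero
    logits = [
        math.sin(0.7 * (t - lag)) + wobble * math.sin(3.1 * t),
        math.sin(0.9 * (t - lag) + 1.0) + wobble * math.sin(2.7 * t + 1.0),
        math.sin(1.3 * (t - lag) + 2.0) + wobble * math.sin(3.5 * t + 2.0),
    ]
    z = [math.exp(l) for l in logits]
    s = sum(z)
    return [v / s for v in z]
```

Because both distributions are pure functions of time, every frame is reproducible: rendering at time t always yields the same bars, and the view-cycling phase can be derived from `t % period` passed through `ease`.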