
information-theoretic-bounds

Visualizes entropy H(X) as the minimum number of bits needed to specify a solution, then animates an observation Y that transfers mutual information I(X;Y), shown as a ‘chunk’ of entropy removed. The remaining conditional entropy H(X|Y) = H(X) − I(X;Y) is displayed alongside a Fano-style gauge indicating the unavoidable lower bound on the probability of error while uncertainty remains.
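The identity the animation acts out can be checked numerically. A minimal sketch in pure Python (the toy joint distribution, where Y = X mod 2, is an illustrative assumption, not the one the visualization uses):

```python
from math import log2
from collections import defaultdict

def entropy(p):
    """Shannon entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

# Toy joint: X uniform on {0,1,2,3} (H(X) = 2 bits, matching the 0-2 bit bars);
# the observation Y = X mod 2 reveals exactly one of those bits.
joint = {(x, x % 2): 0.25 for x in range(4)}

px, py = defaultdict(float), defaultdict(float)
for (x, y), p in joint.items():
    px[x] += p
    py[y] += p

H_X = entropy(px)
H_X_given_Y = entropy(joint) - entropy(py)  # chain rule: H(X|Y) = H(X,Y) - H(Y)
I_XY = H_X - H_X_given_Y                    # mutual information

print(H_X, I_XY, H_X_given_Y)  # 2.0 1.0 1.0
```

One bit of mutual information removes exactly one bit from the entropy bar, leaving one bit of conditional entropy behind.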


practical uses

  • 01. Deriving lower bounds on sample complexity (how much data is needed) in estimation and learning
  • 02. Reasoning about the limits of communication/compression and the best possible performance of codes
  • 03. Proving impossibility results / minimax lower bounds for algorithms via information arguments
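The first and third uses follow a common pattern: to identify one of M hypotheses with error at most δ, Fano's inequality says the data must carry roughly (1 − δ)·log2(M) − 1 bits, which divides through into a sample-count bound. A back-of-the-envelope sketch (the function name and the "bits per sample" parameter are illustrative assumptions, not a rigorous minimax derivation):

```python
from math import log2

def min_samples(num_hypotheses, target_err, bits_per_sample):
    """Fano-style sample-complexity lower bound (sketch):
    distinguishing M hypotheses with error <= delta requires the data to
    carry at least (1 - delta) * log2(M) - 1 bits of mutual information."""
    needed_bits = (1 - target_err) * log2(num_hypotheses) - 1
    return max(0.0, needed_bits / bits_per_sample)

# e.g. 1024 hypotheses, 5% target error, 0.1 bits of information per sample:
print(min_samples(1024, 0.05, 0.1))  # 85.0 samples at minimum
```

No algorithm, however clever, can beat this count: it is a statement about the information the samples contain, not about any particular estimator.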

technical notes

Single 4s animation cycle with staged phases; entropy bars are scaled in bits (0–2) and split into I(X;Y) and H(X|Y) to illustrate I(X;Y) = H(X) − H(X|Y). A simplified Fano proxy gauge uses H(X|Y)/H(X) as an intuitive stand-in for the error lower bound. Grid-snapped rectangles and monospace labels create a blocky green-on-black aesthetic; the layout is fully responsive via scale = min(w, h)/240.
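To make "simplified proxy" concrete, the gauge value H(X|Y)/H(X) can be compared against the exact rearrangement of Fano's inequality, P_err ≥ (H(X|Y) − 1) / log2(|X| − 1). A hedged sketch (function names are illustrative; the proxy is deliberately looser but monotone in the same direction):

```python
from math import log2

def fano_bound(H_cond, M):
    """Exact Fano rearrangement: P_err >= (H(X|Y) - 1) / log2(M - 1),
    clamped to 0 when the bound is vacuous."""
    return max(0.0, (H_cond - 1) / log2(M - 1))

def proxy_gauge(H_cond, H_X):
    """The visualization's simplified gauge: H(X|Y) / H(X), clamped to [0, 1]."""
    return min(1.0, max(0.0, H_cond / H_X))

# With H(X) = 2 bits (M = 4 equiprobable states), sweep conditional entropy:
for H_cond in (0.0, 0.5, 1.0, 1.5, 2.0):
    print(H_cond, round(proxy_gauge(H_cond, 2.0), 3), round(fano_bound(H_cond, 4), 3))
```

Both hit zero when Y resolves X completely; the proxy rises immediately as uncertainty grows, whereas the exact Fano bound stays vacuous until H(X|Y) exceeds 1 bit, which is why the gauge reads better as an intuition aid than as a tight bound.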