mutual-information

Shows a joint distribution P(X,Y) as a blocky 8×8 heatmap, with marginals P(X), P(Y). The animation cycles between independence (P(X,Y)=P(X)P(Y), diff grid near zero, I(X;Y)≈0) and dependence (diagonal structure, diff grid lights up, I(X;Y)>0). A decomposition bar visualizes I(X;Y)=H(X)−H(X|Y) by splitting H(X) into the shared part (mutual information) and the remaining uncertainty (conditional entropy).
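The decomposition I(X;Y) = H(X) − H(X|Y) can be checked numerically. A minimal sketch in NumPy (the helper names are mine, not taken from the visualization's source); for an independent product-of-marginals joint, the shared part collapses to zero and all of H(X) remains as conditional entropy:

```python
import numpy as np

def entropies_bits(P):
    """Entropy quantities (in bits) for a joint distribution P[x, y]."""
    P = P / P.sum()
    px = P.sum(axis=1)            # marginal P(X)
    py = P.sum(axis=0)            # marginal P(Y)

    def H(p):
        p = p[p > 0]              # 0 * log 0 is treated as 0
        return float(-(p * np.log2(p)).sum())

    Hx, Hy, Hxy = H(px), H(py), H(P.ravel())
    I = Hx + Hy - Hxy             # mutual information I(X;Y)
    Hx_given_y = Hx - I           # conditional entropy H(X|Y)
    return Hx, Hx_given_y, I

# Independent case: P(X,Y) = P(X)P(Y) over an 8x8 grid.
px = np.full(8, 1 / 8)
P_indep = np.outer(px, px)
Hx, Hxgy, I = entropies_bits(P_indep)
print(Hx, Hxgy, I)  # H(X) = 3 bits, H(X|Y) = 3 bits, I(X;Y) ~ 0
```

With a uniform 8-value marginal, H(X) is exactly log2(8) = 3 bits, which is the full length of the decomposition bar before any dependence is introduced.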

practical uses

  • 01. Feature selection: measure how informative a feature is about a label
  • 02. Detecting dependence beyond linear correlation (captures nonlinear relationships)
  • 03. Clustering and representation learning (e.g., mutual-information-based objectives)
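The feature-selection use case can be illustrated with empirical mutual information over discrete data. This is a hedged sketch with made-up data (the `mi_bits` helper and both features are hypothetical): a feature that determines the label carries about a full bit, while random noise carries almost none.

```python
import numpy as np

def mi_bits(x, y):
    """Empirical mutual information (bits) between two discrete arrays."""
    xs = np.unique(x, return_inverse=True)[1]
    ys = np.unique(y, return_inverse=True)[1]
    joint = np.zeros((xs.max() + 1, ys.max() + 1))
    np.add.at(joint, (xs, ys), 1)          # count co-occurrences
    P = joint / joint.sum()
    px = P.sum(axis=1, keepdims=True)
    py = P.sum(axis=0, keepdims=True)
    nz = P > 0
    return float((P[nz] * np.log2(P[nz] / (px @ py)[nz])).sum())

# Hypothetical data: feature_a determines the label, feature_b is noise.
rng = np.random.default_rng(0)
label = rng.integers(0, 2, 1000)
feature_a = label                          # perfectly informative
feature_b = rng.integers(0, 2, 1000)       # uninformative
print(mi_bits(feature_a, label) > mi_bits(feature_b, label))  # True
```

Ranking features by this score is the simplest form of mutual-information feature selection; note that empirical MI of truly independent variables is slightly above zero at finite sample sizes.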

technical notes

Computes entropies and mutual information in bits from an 8×8 synthetic joint distribution that smoothly blends between an independent product-of-marginals model and a diagonal-biased dependent model. Uses time-based easing to linger at endpoints, renders blocky snapped rectangles, and includes an animated scanline to illustrate how conditioning on Y narrows uncertainty in X.
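The blend described above can be sketched as a convex combination of the two endpoint distributions, with a smoothstep easing standing in for the time-based lingering (an assumption; the animation's actual timing curve and diagonal weighting may differ):

```python
import numpy as np

def ease(t):
    """Smoothstep: flattens near t=0 and t=1, so the animation lingers
    at the endpoints (assumed; the real easing may differ)."""
    return t * t * (3 - 2 * t)

def mi_bits(P):
    """Mutual information (bits) of a joint distribution P[x, y]."""
    P = P / P.sum()
    px = P.sum(axis=1, keepdims=True)
    py = P.sum(axis=0, keepdims=True)
    nz = P > 0
    return float((P[nz] * np.log2(P[nz] / (px * py)[nz])).sum())

n = 8
px = np.full(n, 1 / n)
P_indep = np.outer(px, px)                 # independent endpoint
P_dep = P_indep + 0.8 * np.eye(n) / n      # diagonal-biased endpoint
P_dep /= P_dep.sum()

for t in (0.0, 0.5, 1.0):
    P = (1 - ease(t)) * P_indep + ease(t) * P_dep
    print(f"t={t}: I(X;Y) = {mi_bits(P):.3f} bits")
# I(X;Y) ~ 0 at t=0 and grows as the diagonal structure appears.
```

The diff grid in the visualization corresponds to P − P_indep at the current blend value: identically near zero at the independent endpoint, concentrated along the diagonal at the dependent one.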