
curse-of-dimensionality

Three synchronized panels animate how increasing the dimensionality d (at fixed sample count n) changes geometry and learning: (1) volume scales like r^d, so most of a ball's mass sits near its surface and the "central" mass collapses quickly; (2) the same n samples occupy a vanishing fraction of the exponentially many coarse bins, so the data become sparse; (3) pairwise distances concentrate, with relative spread shrinking like 1/√d, which erodes distance contrast and raises sample complexity.
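The third panel's claim is easy to check empirically. A minimal Python sketch (not part of the visualization's own canvas code; the helper name `rel_spread` is invented for illustration) estimates the relative spread (std/mean) of distances between random points in [0,1]^d, which should fall off roughly like 1/√d:

```python
# Illustrative sketch: pairwise distances between uniform points in [0,1]^d
# concentrate as d grows, i.e. std/mean of the distance shrinks ~ 1/sqrt(d).
import math
import random

def rel_spread(d, n_pairs=2000, seed=0):
    """Relative spread (std/mean) of distances between uniform points in [0,1]^d."""
    rng = random.Random(seed)
    dists = []
    for _ in range(n_pairs):
        x = [rng.random() for _ in range(d)]
        y = [rng.random() for _ in range(d)]
        dists.append(math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y))))
    mean = sum(dists) / len(dists)
    var = sum((v - mean) ** 2 for v in dists) / len(dists)
    return math.sqrt(var) / mean

for d in (1, 4, 16, 64):
    print(d, round(rel_spread(d), 3))
```

The printed values shrink monotonically with d, which is exactly the narrowing band the panel draws.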


practical uses

  • 01.Explaining why high-dimensional feature spaces need regularization and/or dimensionality reduction
  • 02.Building intuition for why nearest-neighbor and distance-based methods degrade in high dimensions
  • 03.Motivating architectural choices (bottlenecks, embeddings) and data requirements for deep models

technical notes

Uses responsive scaling (scale = min(w, h)/240), grid snapping for a retro blocky look, and a 4.2 s eased cycle that sweeps d from 1 to 12. Sparsity uses the occupancy approximation bins·(1 − exp(−n/bins)) with bins = 2^d, the expected number of bins hit when n samples land uniformly at random. Distance concentration uses the analytic moments of the squared distance between two uniform points in [0,1]^d (mean d/6, std sqrt(7d/180)), visualized as a narrowing band and a shrinking contrast meter.
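Both formulas in the notes can be sanity-checked by Monte Carlo. A sketch (helper name `occupied_bins` and the parameter choices n = 100, d = 8 are assumptions for illustration, not values from the visualization) compares the occupancy approximation against simulated bin counts, and the squared-distance moments against their analytic values:

```python
# Sketch verifying the two formulas above:
#   occupied bins ≈ bins * (1 - exp(-n/bins)) for bins = 2**d,
#   squared distance in [0,1]^d has mean d/6 and std sqrt(7*d/180).
import math
import random

def occupied_bins(n, bins, trials=200, seed=1):
    """Average number of distinct bins hit by n uniform samples."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += len({rng.randrange(bins) for _ in range(n)})
    return total / trials

n, d = 100, 8            # illustrative choices
bins = 2 ** d
approx = bins * (1 - math.exp(-n / bins))
print(occupied_bins(n, bins), approx)        # the two agree closely

# Monte Carlo moments of the squared distance for d = 8.
rng = random.Random(2)
sq = [sum((rng.random() - rng.random()) ** 2 for _ in range(d))
      for _ in range(20000)]
mean = sum(sq) / len(sq)
std = math.sqrt(sum((v - mean) ** 2 for v in sq) / len(sq))
print(mean, d / 6)                  # empirical vs analytic mean
print(std, math.sqrt(7 * d / 180))  # empirical vs analytic std
```

The occupancy approximation replaces the exact expectation bins·(1 − (1 − 1/bins)^n) with its exponential limit, which is accurate whenever bins is large.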