automatic-differentiation

Visualizes a tiny program trace (mul → sin → add) as a computational graph and shows how automatic differentiation applies the chain rule by composing local Jacobians. The animation alternates between forward-mode (pushing value+tangent left-to-right) and reverse-mode (pulling adjoints right-to-left) to highlight their different propagation directions and cost scaling.
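The forward-mode half of the animation can be sketched as dual-number arithmetic: each node carries a (value, tangent) pair, and each op applies its local Jacobian to the tangent as the sweep moves left-to-right. A minimal sketch, assuming the traced program is `y = sin(a*b) + c` (the exact wiring of the mul → sin → add trace is an assumption):

```javascript
// Forward-mode AD with dual numbers for the assumed trace y = sin(a*b) + c.
// Each op returns (value, tangent); the tangent rule is that op's local Jacobian.
const dual = (val, tan) => ({ val, tan });
const mul = (x, y) => dual(x.val * y.val, x.tan * y.val + x.val * y.tan); // product rule
const sinD = (x) => dual(Math.sin(x.val), Math.cos(x.val) * x.tan);      // d/dx sin = cos
const add = (x, y) => dual(x.val + y.val, x.tan + y.tan);                // linearity

// dy/da at (a, b, c) = (2, 3, 1): seed a's tangent with 1, others with 0.
const a = dual(2, 1), b = dual(3, 0), c = dual(1, 0);
const y = add(sinD(mul(a, b)), c);
// y.val = sin(6) + 1, y.tan = cos(6) * 3 (chain rule composed left-to-right)
```

Note that one forward sweep yields the derivative with respect to a single seeded input; getting dy/db would take a second sweep with b's tangent seeded instead.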


practical uses

  • 01. Understanding backpropagation as reverse-mode AD in deep learning
  • 02. Estimating gradient computation cost tradeoffs (forward vs reverse) when choosing implementations
  • 03. Debugging/implementing custom differentiable ops by reasoning about local derivatives and their composition
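The cost tradeoff in point 02 is visible in a reverse-mode sketch of the same assumed trace `y = sin(a*b) + c`: one backward sweep pulls adjoints right-to-left through the local Jacobians and produces all three partials at once, which is why reverse mode (backprop) wins when inputs vastly outnumber outputs. The function name and wiring here are illustrative assumptions:

```javascript
// Reverse-mode AD (hypothetical helper) for the assumed trace y = sin(a*b) + c.
// Forward pass records intermediates; backward pass pulls adjoints right-to-left.
function gradSinMulAdd(a, b, c) {
  // forward pass
  const v1 = a * b;
  const v2 = Math.sin(v1);
  const y = v2 + c;
  // backward pass: seed the output adjoint, then apply local Jacobians transposed
  const dy = 1;
  const dv2 = dy;                  // add node: identity
  const dc = dy;
  const dv1 = Math.cos(v1) * dv2;  // sin node: cos(v1)
  const da = b * dv1;              // mul node: ∂(ab)/∂a = b
  const db = a * dv1;              // mul node: ∂(ab)/∂b = a
  return { y, da, db, dc };        // all partials from a single backward sweep
}
```

Forward mode would need one sweep per input (three here) to recover the same gradient.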

technical notes

Single self-contained Canvas2D draw function. Uses a 4–8px snapped grid for a retro blocky look, GREEN/GREEN_DIM on black, and a 4.2s cycle that alternates forward and reverse sweeps with cubic easing (ease). The visualization is symbolic (local Jacobians labeled) to emphasize chain-rule composition over numeric evaluation.
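The timing and grid machinery described above can be sketched as follows; the helper names (`ease`, `snap`, `sweep`) and the forward/reverse phase split are assumptions, with only the 4.2s cycle, cubic easing, and 4–8px snapping taken from the notes:

```javascript
// Cubic ease-in-out, as in the notes' `ease` (exact curve is an assumption).
const ease = (t) => (t < 0.5 ? 4 * t * t * t : 1 - Math.pow(-2 * t + 2, 3) / 2);

// Snap coordinates to a coarse grid for the retro blocky look (4–8px per notes).
const snap = (x, grid = 4) => Math.round(x / grid) * grid;

// Map elapsed seconds into the 4.2s cycle: assumed first half forward sweep,
// second half reverse sweep, each eased independently.
const CYCLE = 4.2;
function sweep(tSeconds) {
  const p = (tSeconds % CYCLE) / CYCLE;
  return p < 0.5
    ? { dir: "forward", u: ease(p * 2) }
    : { dir: "reverse", u: ease((p - 0.5) * 2) };
}
```

A draw function would then interpolate the highlight position along the graph edges with `sweep(t).u`, snapping every drawn coordinate through `snap` before rendering.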