computational-graphs

Visualizes a small computational graph in which each node is an operation f(…) and each directed edge carries a tensor/value. The animation alternates between a forward pass (values flow left→right through the graph) and a backward pass (gradients ∂L/∂x propagate right→left along the same dependencies), illustrating reverse-mode autodiff (backprop) via the chain rule.
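The forward/backward alternation the animation shows can be sketched as a minimal reverse-mode autodiff over such a graph. This is an illustrative sketch, not the visualization's actual code; the `Node`, `add`, `mul`, and `backward` names are invented for the example:

```python
class Node:
    """A node in a tiny computational graph: holds a value (forward pass)
    and a gradient (backward pass), plus edges to its inputs together
    with the local derivative along each edge."""
    def __init__(self, value, parents=()):
        self.value = value        # computed on the forward pass
        self.grad = 0.0           # accumulated d(output)/d(self)
        self.parents = parents    # list of (parent_node, local_derivative)

def add(a, b):
    # d(a+b)/da = 1, d(a+b)/db = 1
    return Node(a.value + b.value, [(a, 1.0), (b, 1.0)])

def mul(a, b):
    # d(a*b)/da = b, d(a*b)/db = a
    return Node(a.value * b.value, [(a, b.value), (b, a.value)])

def backward(out):
    """Backward pass: visit nodes in reverse topological order so each
    node's gradient is fully accumulated before it is propagated, then
    push gradients right→left along the same edges the values used."""
    topo, seen = [], set()
    def build(n):
        if id(n) not in seen:
            seen.add(id(n))
            for p, _ in n.parents:
                build(p)
            topo.append(n)
    build(out)
    out.grad = 1.0                # seed: dL/dL = 1
    for node in reversed(topo):
        for parent, local in node.parents:
            parent.grad += local * node.grad   # chain rule

# f(x, y) = (x + y) * x at x=3, y=2: forward gives 15,
# backward gives df/dx = (x + y) + x = 8 and df/dy = x = 3
x, y = Node(3.0), Node(2.0)
out = mul(add(x, y), x)
backward(out)
```

Values flow through `add`/`mul` left→right; `backward` retraces the identical edges right→left, which is exactly the two phases the animation alternates between.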


practical uses

  • Understanding how neural network layers compose and how intermediate activations feed downstream operations
  • Debugging model implementations by locating where values/gradients should flow in forward/backward passes
  • Explaining automatic differentiation systems (PyTorch/TF/JAX) and how parameter gradients dL/dw, dL/db are produced
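As a concrete instance of the parameter gradients dL/dw, dL/db mentioned above, the backward pass for a one-neuron linear model can be derived by hand. This is a sketch with hypothetical names, assuming a squared-error loss L = (w·x + b − t)²:

```python
def linear_loss_grads(w, b, x, t):
    """Forward: y = w*x + b, L = (y - t)^2.
    Backward (reverse mode): dL/dy = 2*(y - t), then by the chain rule
    dL/dw = dL/dy * dy/dw = dL/dy * x  and  dL/db = dL/dy * dy/db = dL/dy."""
    y = w * x + b              # forward pass
    L = (y - t) ** 2
    dL_dy = 2.0 * (y - t)      # gradient at the output, pushed backward
    return L, dL_dy * x, dL_dy

# e.g. w=2, b=1, x=3, t=5: y=7, L=4, dL/dw=12, dL/db=4
```

Frameworks like PyTorch, TensorFlow, and JAX produce these same quantities automatically by recording the forward graph and replaying it in reverse, as in the visualization.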

technical notes

Pure Canvas2D drawing with a snapped grid for a blocky aesthetic. Edges are drawn as elbow arrows; value-flow uses solid arrows and gradient-flow uses dashed arrows. A timed 4.2s cycle highlights one edge at a time and animates a square packet along it; the active pass changes text/opacity to emphasize forward vs backward propagation.
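The one-edge-at-a-time highlighting described above amounts to mapping elapsed time into a position within the cycle. A minimal sketch of that timing logic, assuming the 4.2s period is split evenly across the edges (the function name and edge labels are hypothetical, not the page's actual code):

```python
def packet_state(t, edges, cycle=4.2):
    """Given elapsed time t in seconds and an ordered list of edges,
    return the currently highlighted edge and the packet's fractional
    position along it (0..1), assuming the cycle is divided evenly."""
    phase = (t % cycle) / cycle              # 0..1 through the whole cycle
    idx = min(int(phase * len(edges)), len(edges) - 1)
    frac = phase * len(edges) - idx          # 0..1 along the active edge
    return edges[idx], frac

# e.g. with edges ["x->f", "f->g", "g->L"], each edge is active
# for 1.4s of the 4.2s cycle before the highlight advances.
```

On each animation frame the caller would pass the current timestamp, draw the returned edge at full opacity, and place the square packet at `frac` of the way along its elbow path.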