A four-panel, green-on-black animated dashboard showing how finite-precision arithmetic causes (1) catastrophic cancellation, which becomes severe under ill-conditioning (high κ); (2) overflow/underflow in exp-based computations, contrasting a naive softmax with a stable log-sum-exp shift; and (3) rounding/quantization spacing that grows with |x| in proportion to ε_machine. A cycling checklist highlights common algebraic reformulations and scaling/normalization fixes used in deep learning to keep computations in safe numeric ranges and limit error amplification.
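The cancellation panel's idea can be sketched in plain JavaScript (function names here are illustrative, not taken from the dashboard's source): subtracting nearly equal numbers discards the leading significant digits, while an algebraic reformulation of the same quantity avoids the subtraction entirely.

```javascript
// Catastrophic cancellation: 1 - cos(x) for small x loses all significant
// digits, because cos(x) rounds to exactly 1 once x^2/2 < eps/2.
function naiveOneMinusCos(x) {
  return 1 - Math.cos(x); // nearly-equal subtraction: digits cancel
}

// Reformulation using the identity 1 - cos(x) = 2 sin^2(x/2): no subtraction
// of close values, so full precision is retained.
function stableOneMinusCos(x) {
  const s = Math.sin(x / 2);
  return 2 * s * s;
}

const x = 1e-8;
console.log(naiveOneMinusCos(x));  // 0 — every significant digit cancelled
console.log(stableOneMinusCos(x)); // 5e-17 — correct to full precision
```

The same pattern (replace a difference of near-equal terms with an equivalent product or built-in like `Math.expm1`/`Math.log1p`) is the standard fix the checklist alludes to.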
Uses a 2x2 panel layout with grid-snapped rectangles for a blocky aesthetic. Animations are time-based (3.6–4.2s cycles), shaped by a provided ease() function. Softmax overflow/underflow is illustrated by simulating float32 exp limits (~±88), while the stable softmax shifts by the max logit. The rounding visualization uses ε_machine ≈ 2.22e-16 and shows spacing ≈ ε·|x|. No external dependencies; pure Canvas 2D API.
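The softmax contrast and the spacing claim can both be checked outside the animation. This sketch (assumed helper names, not the dashboard's code) simulates float32 with `Math.fround`, since JavaScript numbers are float64:

```javascript
// Round every intermediate through float32 to mimic the demo's simulated limits.
const f32 = Math.fround;

// Naive softmax: exp overflows simulated float32 (max ~3.4e38) once a logit
// exceeds ~88.7, so large logits yield Infinity and then Infinity/Infinity = NaN.
function naiveSoftmax(logits) {
  const exps = logits.map(z => f32(Math.exp(z)));
  const sum = exps.reduce((a, b) => f32(a + b), 0);
  return exps.map(e => f32(e / sum));
}

// Stable softmax: subtracting max(z) keeps every exponent <= 0, so exp stays
// in [0, 1] and the result is unchanged mathematically.
function stableSoftmax(logits) {
  const m = Math.max(...logits);
  const exps = logits.map(z => f32(Math.exp(z - m)));
  const sum = exps.reduce((a, b) => f32(a + b), 0);
  return exps.map(e => f32(e / sum));
}

const logits = [100, 101, 102];
console.log(naiveSoftmax(logits));  // [NaN, NaN, NaN]
console.log(stableSoftmax(logits)); // ~[0.090, 0.245, 0.665]

// Rounding spacing grows with |x|: ulp(x) ≈ ε_machine · |x|. Near 1e16 the
// float64 spacing is ~2, so adding 1 is absorbed entirely.
console.log(Number.EPSILON);     // 2.220446049250313e-16
console.log(1e16 + 1 === 1e16);  // true — 1 is below half the local spacing
```

The max-shift is exactly the log-sum-exp trick the second panel animates; the last two lines are the property the rounding panel visualizes.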