Visualizes a deep network as a layered composition f_θ(x) = f_L(⋯f_2(f_1(x))⋯). Animated packets flow left-to-right through layer blocks, while each layer’s representation vector h^l lights up as features are transformed into higher-level abstractions. A cycling “inductive bias” panel switches between CNN (locality + weight sharing), RNN (recurrence), and Attention (global mixing), showing how architecture constrains which connections exist and thus which function families are parameter-efficient and generalize well.
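A minimal sketch of how the bias panel could map each mode to a connection topology between adjacent layers; the names (`BiasMode`, `Edge`, `connectionsFor`, `kernelRadius`) are illustrative assumptions, not identifiers from the actual implementation:

```ts
// Sketch: each inductive-bias mode constrains which edges exist between
// layer l (width n) and layer l+1 (width m).
type BiasMode = "cnn" | "rnn" | "attention";

interface Edge {
  from: number;    // unit index in layer l
  to: number;      // unit index in layer l+1
  shared?: boolean; // rendered as a shared (tied-weight) kernel edge
  loop?: boolean;   // rendered as a recurrence self-loop
}

function connectionsFor(mode: BiasMode, n: number, m: number): Edge[] {
  const edges: Edge[] = [];
  switch (mode) {
    case "cnn": {
      // Locality: unit j sees only a small window of layer l;
      // weight sharing: the same kernel (shared=true) slides across positions.
      const kernelRadius = 1; // hypothetical window half-width
      for (let j = 0; j < m; j++) {
        for (let di = -kernelRadius; di <= kernelRadius; di++) {
          const i = j + di;
          if (i >= 0 && i < n) edges.push({ from: i, to: j, shared: true });
        }
      }
      break;
    }
    case "rnn": {
      // Recurrence: one-to-one feedforward edges plus a self-loop per unit,
      // drawn as the recurrence arrow in the panel.
      for (let j = 0; j < Math.min(n, m); j++) {
        edges.push({ from: j, to: j });
        edges.push({ from: j, to: j, loop: true });
      }
      break;
    }
    case "attention": {
      // Global mixing: every unit in layer l+1 attends to every unit in layer l.
      for (let j = 0; j < m; j++)
        for (let i = 0; i < n; i++) edges.push({ from: i, to: j });
      break;
    }
  }
  return edges;
}
```

Regenerating the edge list on each mode switch keeps the renderer generic: the same drawing loop handles all three topologies, and the sparsity of the CNN/RNN edge lists versus the dense attention fan makes the parameter-efficiency contrast visible directly.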
Uses a 3.2s forward-pass phase for smooth highlighting across layers; packets are lightweight stateful elements advanced each frame by the elapsed time dt, keeping motion framerate-independent. The blocky aesthetic is enforced via grid snapping and strokeRect/fillRect primitives; each bias mode changes the rendered connection topology (local + shared kernels, a recurrence loop, or global fan-out connections).
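A minimal sketch of the dt-driven packet update and grid snapping, assuming a standard Canvas 2D context; the constants (`PHASE_MS`, `GRID`) and helper names (`snap`, `step`, `phaseProgress`) are hypothetical, not taken from the source:

```ts
const PHASE_MS = 3200; // one 3.2s forward-pass phase
const GRID = 8;        // grid cell size in px for the blocky aesthetic

// Quantize a continuous coordinate to the grid.
const snap = (v: number): number => Math.round(v / GRID) * GRID;

interface Packet {
  x: number;     // continuous position in px
  y: number;
  speed: number; // px per second, left-to-right
}

// Advance and draw all packets; dt is seconds since the last frame,
// so motion stays framerate-independent.
function step(packets: Packet[], dt: number, ctx: CanvasRenderingContext2D): void {
  for (const p of packets) {
    p.x += p.speed * dt;                              // stateful update
    ctx.fillRect(snap(p.x), snap(p.y), GRID, GRID);   // snapped fill = blocky packet
    ctx.strokeRect(snap(p.x), snap(p.y), GRID, GRID); // outline reinforces the pixel look
  }
}

// Progress through the current forward pass in [0, 1), used to decide
// which layer block is highlighted at time tMs (milliseconds).
function phaseProgress(tMs: number): number {
  return (tMs % PHASE_MS) / PHASE_MS;
}
```

Keeping positions continuous and snapping only at draw time means packets move smoothly through the 3.2s phase while still landing on grid-aligned cells, so they stay visually registered with the layer blocks.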