Poster Session C: Friday, August 15, 2:00 – 5:00 pm, de Brug & E‑Hall
Langevin Flows for Modeling Neural Latent Dynamics
Yue Song1, T. Anderson Keller2, Yisong Yue1, Pietro Perona1, Max Welling3; 1California Institute of Technology, 2Harvard University, 3University of Amsterdam
Presenter: T. Anderson Keller
We propose LangevinFlow, a sequential variational model for neural population activity, where latent dynamics are governed by the underdamped Langevin equation. This framework captures both intrinsic neural dynamics and external unobserved inputs through physically grounded priors -- incorporating inertia, damping, stochasticity, and a learned potential landscape. The potential is parameterized as a locally coupled oscillator network, biasing the model toward oscillatory and flow-like behaviors observed in real neural circuits. Our architecture combines a recurrent encoder, a one-layer Transformer decoder, and structured Langevin dynamics in the latent space. LangevinFlow achieves strong empirical results: it closely tracks ground-truth firing rates on synthetic data driven by a Lorenz attractor, and outperforms prior methods on the Neural Latents Benchmark across four datasets in terms of both bits-per-spike and forward prediction. It also matches or exceeds baselines in decoding behavioral variables such as hand velocity. This work introduces a compact, physics-inspired, interpretable, and high-performing model for neural population dynamics.
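To make the latent dynamics concrete, the sketch below shows one Euler-Maruyama step of underdamped Langevin dynamics with a learned potential, in PyTorch. This is a rough illustration under stated assumptions, not the authors' implementation: the module name LangevinLatentStep, the small MLP standing in for the locally coupled oscillator-network potential, and the parameters gamma, temperature, and dt are all hypothetical.

```python
import torch
import torch.nn as nn

class LangevinLatentStep(nn.Module):
    """One Euler-Maruyama step of underdamped Langevin dynamics in latent space (sketch)."""

    def __init__(self, latent_dim: int, gamma: float = 0.5,
                 temperature: float = 0.1, dt: float = 0.05):
        super().__init__()
        self.gamma, self.temperature, self.dt = gamma, temperature, dt
        # Learned potential U(z); the paper parameterizes it as a locally
        # coupled oscillator network -- a small MLP stands in here.
        self.potential = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.Tanh(), nn.Linear(64, 1)
        )

    def forward(self, z: torch.Tensor, v: torch.Tensor):
        # Force term: negative gradient of the learned potential at z.
        with torch.enable_grad():
            z_req = z if z.requires_grad else z.requires_grad_(True)
            grad_U = torch.autograd.grad(self.potential(z_req).sum(), z_req,
                                         create_graph=self.training)[0]
        # Underdamped Langevin update: inertia (velocity), damping, force, and noise.
        noise = torch.randn_like(v) * (2.0 * self.gamma * self.temperature * self.dt) ** 0.5
        v_next = v + (-self.gamma * v - grad_U) * self.dt + noise
        z_next = z + v_next * self.dt
        return z_next, v_next

# Usage sketch: roll a batch of latent states forward for 100 time steps.
step = LangevinLatentStep(latent_dim=16)
z, v = torch.zeros(8, 16), torch.zeros(8, 16)
for _ in range(100):
    z, v = step(z, v)
```

In the full model, these Langevin updates would sit between the recurrent encoder (which infers initial latent states from spiking data) and the Transformer decoder (which maps latent trajectories to firing rates); the rollout above only illustrates the prior dynamics themselves.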
Topic Area: Brain Networks & Neural Dynamics
Extended Abstract: Full Text PDF