Poster Session A: Tuesday, August 12, 1:30 – 4:30 pm, de Brug & E‑Hall

Universal Differential Equations as a Common Modeling Language for Neuroscience

Ahmed ElGazzar1, Marcel van Gerven1; 1Donders Institute for Brain, Cognition and Behaviour, Radboud University

Presenter: Ahmed ElGazzar

The rise of large-scale neuroscience datasets has driven widespread adoption of deep neural networks (DNNs) as models of biological neural systems. While DNNs can approximate functions directly from data, circumventing the need for mechanistic modeling, they risk producing implausible and difficult-to-interpret models. In this paper, we argue for universal differential equations (UDEs) as a unifying approach to model development and validation in neuroscience. UDEs treat differential equations as parameterizable, differentiable mathematical objects that can be augmented and trained with scalable deep learning techniques. This synergy integrates classical mathematical modeling with emerging advances in AI into a potent framework. We provide a primer on this burgeoning topic in scientific machine learning and describe a generative modeling recipe for fitting UDEs to neural and behavioral data. Our goal is to show how UDEs can fill a critical gap between mechanistic, phenomenological, and data-driven models in neuroscience, and to highlight their potential to address inherent challenges across diverse applications such as understanding neural computation, controlling neural systems, neural decoding, and normative modeling.
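The core idea of a UDE — a mechanistic differential equation augmented with a trainable neural-network term, fit end to end through the solver — can be sketched in a few lines. The example below is a hypothetical illustration, not the authors' recipe: the damped-oscillator system, the tiny MLP, and the finite-difference gradient descent (a stand-in for the autodiff-through-the-solver training that UDE frameworks use) are all assumptions chosen to keep the sketch self-contained.

```python
import numpy as np

# Minimal UDE sketch (hypothetical example, not the authors' code):
# the vector field combines a known mechanistic part A @ x with a small
# neural network nn(x) that absorbs the unknown dynamics:
#   dx/dt = A @ x + nn(x; theta)

rng = np.random.default_rng(0)

def nn(x, W1, b1, W2, b2):
    """Tiny MLP standing in for the unknown term of the vector field."""
    return W2 @ np.tanh(W1 @ x + b1) + b2

def rollout(x0, rhs, dt=0.05, steps=60):
    """Forward-Euler integration of dx/dt = rhs(x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + dt * rhs(xs[-1]))
    return np.stack(xs)

# Ground truth: a damped oscillator with a cubic term that the linear
# mechanistic model A alone cannot capture.
A = np.array([[0.0, 1.0], [-1.0, -0.1]])
true_rhs = lambda x: A @ x + np.array([0.0, -0.5 * x[0] ** 3])

x0 = np.array([1.0, 0.0])
target = rollout(x0, true_rhs)

def loss(params):
    """Trajectory-matching MSE between the UDE and the observed data."""
    model_rhs = lambda x: A @ x + nn(x, *params)
    return np.mean((rollout(x0, model_rhs) - target) ** 2)

# Train the network term by gradient descent with central finite
# differences (a stand-in for autodiff through the ODE solver).
params = [0.1 * rng.standard_normal(s) for s in [(8, 2), (8,), (2, 8), (2,)]]
init_loss = loss(params)
lr, eps = 0.1, 1e-5
for _ in range(60):
    grads = []
    for p in params:
        g = np.zeros_like(p)
        for idx in np.ndindex(p.shape):
            old = p[idx]
            p[idx] = old + eps; lp = loss(params)
            p[idx] = old - eps; lm = loss(params)
            p[idx] = old
            g[idx] = (lp - lm) / (2 * eps)
        grads.append(g)
    step = [p - lr * g for p, g in zip(params, grads)]
    # Simple backtracking: only accept steps that reduce the loss.
    if loss(step) < loss(params):
        params = step
    else:
        lr *= 0.5

print(f"trajectory MSE: {init_loss:.4f} -> {loss(params):.4f}")
```

After training, the learned network term compensates for the cubic nonlinearity the linear mechanistic model misses, which is the division of labor UDEs formalize: interpretable structure where the science is known, flexible function approximation where it is not.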

Topic Area: Methods & Computational Tools

Extended Abstract: Full Text PDF