Poster Session C: Friday, August 15, 2:00 – 5:00 pm, de Brug & E‑Hall
Efficient Regularization of High-Dimensional Cerebellar Representations by Sparse Parallel Fiber Inputs: A Virtual Sample-based L2 Regularization Perspective
Jundong Kim1, Tae-Rim Yun1, Chang-Eop Kim; 1Gachon University
Presenter: Tae-Rim Yun
The cerebellum generates extremely high-dimensional representations through parallel fibers (PF) originating from the granule cell layer, enabling precise motor learning and predictive control. However, this dimensionality expansion risks overfitting when it surpasses the intrinsic dimensionality of the input data. This study proposes that spontaneous, highly sparse PF inputs serve as explicit virtual data samples that effectively implement an L2 regularization mechanism analogous to ridge regularization in multiple linear regression. Specifically, the sparse PF inputs mathematically resemble virtual samples, each activating a single feature with value $\sqrt{\lambda}$ and carrying target output $y=0$. Improper activation of Purkinje cells (PC) by these sparse inputs triggers error signals via climbing fibers (CF), consequently inducing long-term depression (LTD) at PF-PC synapses. This interpretation extends traditional adaptive filter theories based on the delta learning rule and integrates recent perspectives on spontaneous activity-driven pruning in generative models. Experimental validations using optogenetics and electrophysiology are proposed.
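The virtual-sample construction the abstract invokes is a standard identity from linear regression: augmenting the design matrix with one row per feature containing $\sqrt{\lambda}$ on that feature (and zeros elsewhere), with target $y=0$, makes ordinary least squares equivalent to ridge (L2-regularized) regression. A minimal NumPy sketch of this equivalence (synthetic data; not the authors' cerebellar model) is:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, lam = 50, 5, 2.0
X = rng.normal(size=(n, d))   # "mossy fiber" inputs, illustratively
y = rng.normal(size=n)

# Ridge regression, closed form: w = (X^T X + lam*I)^{-1} X^T y
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Equivalent formulation via virtual samples: d sparse rows,
# each with a single feature of value sqrt(lam) and target 0.
X_aug = np.vstack([X, np.sqrt(lam) * np.eye(d)])
y_aug = np.concatenate([y, np.zeros(d)])
w_virtual, *_ = np.linalg.lstsq(X_aug, y_aug, rcond=None)

print(np.allclose(w_ridge, w_virtual))  # the two solutions coincide
```

The augmented least-squares objective is $\|y - Xw\|^2 + \lambda\|w\|^2$, exactly the ridge loss, which is why the sparse one-hot virtual rows act as an L2 penalty.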
Topic Area: Methods & Computational Tools
Extended Abstract: Full Text PDF