Poster Session C: Friday, August 15, 2:00 – 5:00 pm, de Brug & E‑Hall

Early learning of the optimal constant solution in neural networks and humans

Jirko Rubruck1, Jan Philipp Bauer2, Andrew M Saxe2, Christopher Summerfield1; 1University of Oxford, 2University College London, University of London

Presenter: Jirko Rubruck

Deep neural networks learn increasingly complex functions over the course of training. Here, we show both empirically and theoretically that learning of the target function is preceded by an early phase in which networks learn the optimal constant solution (OCS) – that is, initial model responses mirror the distribution of target labels while entirely ignoring the information provided in the input. Using a hierarchical category learning task, we derive exact solutions for the learning dynamics of deep linear networks trained with bias terms. Even when initialized to zero, this simple architectural feature induces substantial changes in early learning dynamics. We identify hallmarks of this early OCS phase and show how these signatures appear both in deep linear networks and in larger, nonlinear convolutional neural networks solving a hierarchical learning task based on MNIST and CIFAR10. We train human learners over the course of three days on a structurally equivalent learning task and identify qualitative signatures of the early OCS phase in terms of true negative rates. Surprisingly, we find the same early reliance on the OCS in the behavior of human learners. Finally, we show that learning of the OCS can emerge even in the absence of bias terms, driven equivalently by generic correlations in the input data. Overall, our work suggests that the OCS is a common phenomenon in supervised, error-corrective learning, both biological and artificial, and points to possible factors underlying its prevalence.
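A minimal sketch of the core idea, not the authors' architecture or task: a linear network with a zero-initialized bias term, trained by gradient descent on a squared-error loss with imbalanced one-hot labels. Under squared error, the OCS is the mean target vector, i.e. the label marginal distribution; the class proportions, input scale, and learning rate below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n, d, k = 2000, 20, 4                         # samples, input dim, classes

# Imbalanced one-hot labels: the OCS under squared error is the
# label marginal distribution [0.4, 0.3, 0.2, 0.1].
labels = rng.choice(k, size=n, p=[0.4, 0.3, 0.2, 0.1])
Y = np.eye(k)[labels]

# Weak, zero-mean input signal, so the input-dependent fit is learned slowly.
M = np.zeros((k, d))
M[np.arange(k), np.arange(k)] = 0.3           # class means in the first k dims
X = M[labels] + 0.3 * rng.standard_normal((n, d))
X -= X.mean(axis=0)

W = np.zeros((d, k))                          # zero-initialized weights
b = np.zeros(k)                               # zero-initialized bias
lr = 0.5

for step in range(1, 301):
    pred = X @ W + b                          # network output
    err = pred - Y                            # drives the squared-error gradient
    W -= lr * X.T @ err / n
    b -= lr * err.mean(axis=0)
    if step in (1, 5, 20, 300):
        ocs_gap = np.abs(pred.mean(axis=0) - Y.mean(axis=0)).max()
        print(f"step {step:3d}: |mean output - OCS| = {ocs_gap:.3f}, "
              f"MSE = {(err ** 2).mean():.3f}")

In this toy setting, the gap to the OCS collapses within a handful of steps, as the bias pulls the mean output onto the label marginals, while the overall error improves far more slowly as the weights pick up the weak input-output correlations, mirroring the early OCS phase described above.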

Topic Area: Brain Networks & Neural Dynamics

Proceedings: Full Text on OpenReview