Poster Session A: Tuesday, August 12, 1:30 – 4:30 pm, de Brug & E‑Hall
Partially recurrent neural networks maximize performance and minimize wiring
Marcus Ghosh¹, Dan F. M. Goodman¹; ¹Imperial College London
Presenter: Marcus Ghosh
Many circuits in the brain are bidirectional and sparse: signals flow from sensory inputs to later areas and back, yet between any two connected areas only some of the possible pathways exist. What advantages or disadvantages do these architectures confer compared to feedforward or fully connected networks? To address this question, we introduce a new class of partially recurrent neural network architectures that lie between these two extremes. An exhaustive search of these architectures reveals significant differences in their performance, learning speed and robustness to noise. Surprisingly, many perform as well as, or even better than, fully connected networks, despite having fewer parameters (a proxy for wiring cost). To explain these functional differences, we show that different architectures learn distinct input-output mappings and memory dynamics, both of which are predictive of function. Ultimately, our results demonstrate that partial recurrence allows networks to maximize performance while minimizing wiring. More broadly, our work provides a general framework for linking network structure to function.
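The sketch below is not the authors' code; it is a minimal illustration of the idea of a "partially recurrent" architecture as described in the abstract. The area count, area sizes, mask convention, and dynamics are assumptions made for illustration: obligatory feedforward steps between successive areas, with each optional feedback or skip pathway either present or absent, so that enumerating the optional pathways spans the space between a feedforward chain and a fully connected network. The parameter count at the end stands in for the abstract's wiring-cost proxy.

```python
# Illustrative sketch only (assumed names and sizes, not the authors' implementation).
import numpy as np

N_AREAS = 3          # assumed: sensory -> intermediate -> output
AREA_SIZE = 16       # assumed units per area
rng = np.random.default_rng(0)

def make_architecture(pathways):
    """pathways: dict {(src, dst): bool} over optional inter-area pathways.
    The feedforward chain (i -> i+1) is always present; every other pathway
    (feedback or skip) may be present or absent, giving a family of
    'partially recurrent' architectures between the two extremes."""
    mask = np.zeros((N_AREAS, N_AREAS), dtype=bool)
    for i in range(N_AREAS - 1):
        mask[i, i + 1] = True                  # obligatory feedforward step
    for (src, dst), present in pathways.items():
        mask[src, dst] = present               # optional extra pathway
    return mask

def init_weights(mask):
    """One weight matrix per existing inter-area pathway."""
    return {
        (s, d): rng.normal(0.0, 0.1, (AREA_SIZE, AREA_SIZE))
        for s in range(N_AREAS) for d in range(N_AREAS) if mask[s, d]
    }

def step(state, inputs, weights):
    """One discrete-time update: each area sums drive from its connected areas."""
    new_state = [np.zeros(AREA_SIZE) for _ in range(N_AREAS)]
    new_state[0] += inputs                     # sensory drive enters area 0
    for (s, d), W in weights.items():
        new_state[d] += state[s] @ W
    return [np.tanh(x) for x in new_state]

# Example architecture: one feedback pathway (area 2 -> area 1) and one skip (0 -> 2).
mask = make_architecture({(2, 1): True, (0, 2): True})
weights = init_weights(mask)
state = [np.zeros(AREA_SIZE) for _ in range(N_AREAS)]
for t in range(5):
    state = step(state, rng.normal(size=AREA_SIZE), weights)

print("active pathways:", sorted(weights))
print("parameter count (wiring-cost proxy):", sum(W.size for W in weights.values()))
```

Enumerating all on/off settings of the optional pathways yields an exhaustive, finite architecture space of the kind the abstract searches, with each architecture's parameter count serving as its wiring cost.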
Topic Area: Brain Networks & Neural Dynamics