Poster Session A: Tuesday, August 12, 1:30 – 4:30 pm, de Brug & E‑Hall
Heterogeneous Effect of Input and Task-optimization on the Dynamics of Recurrent Neural Networks
Mohammad Taha Fakharian1, Alireza Ghalambor2, Arman Behrad2, Roxana Zeraati3, Shervin Safavi2; 1University of Tehran, 2Technische Universität Dresden, 3Max-Planck Institute for Biological Cybernetics
Presenter: Mohammad Taha Fakharian
Reverse-engineering task-optimized recurrent neural networks (RNNs) has become a key framework for uncovering mechanisms of brain computation in cognitive tasks. Tasks are often constructed as a set of inputs, and RNNs are then optimized to achieve a set of computational sub-goals given those inputs. Neural dynamics in RNNs can therefore be shaped by two major factors: the structure of the input (defined by the task) and task-based optimization, i.e., training. The former reflects the attributes of the input to the network, while the latter reflects the connectivity shaped by task-based optimization. Although both are major factors shaping network dynamics in a task-specific fashion, how exactly each factor affects network dynamics remains elusive. Here, we investigate the effect of both factors on discriminating neural dynamics across tasks. We systematically vary the network architecture and the input conditions, using three distinct recurrent architectures: Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), and vanilla RNN (V-RNN), trained on cognitive tasks from the NeuroGym library. Although we observed substantial heterogeneity in task-specific dynamics across architectures and choices of task, the input structure defined by the task (rather than task-based optimization of the connectivity) was almost always the dominant source of information about task-specific dynamics.
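The distinction between the two factors can be illustrated with a minimal numpy sketch (not the authors' code): even with random, untrained connectivity, a vanilla RNN's dynamics are shaped by the structure of its input. The network size, trial length, pulse-like "task" input, and the use of the participation ratio as a dimensionality summary are all illustrative assumptions, not details taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 64, 200  # hypothetical network size and trial length

# Random, untrained recurrent and input weights: any difference in the
# dynamics below comes purely from the input, not from training.
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
U = rng.normal(0.0, 1.0, (N, 1))

def run_vrnn(x):
    """Simulate a vanilla RNN: h_{t+1} = tanh(W h_t + U x_t)."""
    h = np.zeros(N)
    traj = np.empty((T, N))
    for t in range(T):
        h = np.tanh(W @ h + (U * x[t]).ravel())
        traj[t] = h
    return traj

# Structured "task" input (a stimulus pulse) vs. unstructured noise.
x_task = np.zeros(T)
x_task[50:100] = 1.0
x_noise = rng.normal(0.0, 0.3, T)

traj_task = run_vrnn(x_task)
traj_noise = run_vrnn(x_noise)

def participation_ratio(traj):
    """Effective dimensionality of the state trajectory."""
    ev = np.linalg.eigvalsh(np.cov(traj.T))
    return ev.sum() ** 2 / (ev ** 2).sum()

print(participation_ratio(traj_task), participation_ratio(traj_noise))
```

Comparing such summaries of the state trajectories across input conditions, before and after training, is one simple way to separate the contribution of input structure from that of optimized connectivity.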
Topic Area: Brain Networks & Neural Dynamics
Extended Abstract: Full Text PDF