
Contributed Talk Session: Friday, August 15, 12:00 – 1:00 pm, Room C1.03
Poster Session C: Friday, August 15, 2:00 – 5:00 pm, de Brug & E‑Hall

Representational Geometry Dynamics in Networks After Long-Range Modulatory Feedback

Kexin Cindy Luo1, George A. Alvarez1, Talia Konkle1; 1Harvard University

Presenter: Kexin Cindy Luo

The human visual system employs extensive long-range feedback circuitry, in which feedforward and feedback connections iteratively refine interpretations through reentrant loops (Di Lollo, 2012). Inspired by this neuroanatomy, a recent computational model incorporated long-range modulatory feedback into a convolutional neural network (Konkle & Alvarez, 2023). While that prior work focused on injecting an external goal signal to leverage feedback for category-based attention, here we investigated its default operation: how learned feedback intrinsically reshapes representational geometry without top-down goals. Analyzing activations from this model on ImageNet data across two passes (feedforward versus modulated), we examined local (within-category) and global (between-category) structure. Our results demonstrate that feedback significantly compacts category clusters: exemplars move closer to prototypes, and local structure improves as more of each exemplar's nearest neighbors fall within the same category. Notably, this occurs while largely preserving global structure, as between-category distances remain relatively stable. An exploratory analysis linking local and global changes suggested a positive relationship between local compaction and prototype shifts. These findings reveal an emergent "prototype effect" in which fixed long-range feedback automatically refines local representations, potentially enhancing categorical processing efficiency without disrupting overall representational organization. This suggests intrinsic feedback dynamics might contribute fundamentally to perceptual organization.
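The two local-geometry measures described above (mean exemplar-to-prototype distance for within-category compaction, and the fraction of same-category nearest neighbors for local structure) can be sketched as follows. This is a minimal illustration, not the authors' code: the function name `local_geometry_metrics`, the Euclidean metric, and the choice of `k` are assumptions for demonstration.

```python
import numpy as np

def local_geometry_metrics(acts, labels, k=10):
    """Compute two local representational-geometry measures (illustrative sketch).

    acts:   (n_samples, n_features) activation matrix from one pass
            (e.g. feedforward or feedback-modulated).
    labels: (n_samples,) integer category labels.
    Returns (mean exemplar-to-prototype distance, k-NN category purity).
    """
    cats = np.unique(labels)
    # Category prototype = mean activation vector over that category's exemplars.
    protos = np.stack([acts[labels == c].mean(axis=0) for c in cats])
    proto_of = protos[np.searchsorted(cats, labels)]
    # Within-category compaction: average distance from exemplar to its prototype.
    proto_dist = np.linalg.norm(acts - proto_of, axis=1).mean()
    # Local structure: fraction of each exemplar's k nearest neighbors
    # that share its category label.
    d = np.linalg.norm(acts[:, None, :] - acts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # exclude self-matches
    nn = np.argsort(d, axis=1)[:, :k]
    purity = (labels[nn] == labels[:, None]).mean()
    return proto_dist, purity
```

Comparing these metrics between the feedforward and modulated passes would show the reported "prototype effect" as a drop in prototype distance and a rise in neighbor purity; between-category structure can be checked separately from distances among the prototypes themselves.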

Topic Area: Object Recognition & Visual Attention

Extended Abstract: Full Text PDF