Poster Session A: Tuesday, August 12, 1:30 – 4:30 pm, de Brug & E‑Hall

Rotating Snakes Illusion Reveals Limitations of Visual-Motion Models in Explaining Human Vision

Isabella Elaine Rosario1, Fan L. Cheng1, Zitang Sun2; 1Columbia University, 2Kyoto University

Presenter: Isabella Elaine Rosario

Deep neural network (DNN) models provide a computational framework for the rigorous study of vision. Recent DNN-based motion models have successfully replicated illusions such as reverse-phi and the barber pole, suggesting possible shared computational principles with human motion processing. However, findings have been mixed on whether DNN models can replicate the "Rotating Snakes" illusion, a class of static patterns that induce motion perception in humans. We tested representative optical-flow estimation models, including models with recurrent architectures and different training approaches, on both grayscale and color versions of Rotating Snakes. None of the models predicted optical flow matching the continuous rotational motion humans perceive, either when presented with consecutive static images or under simulated conditions believed to trigger the illusion, such as saccadic eye movements and stimulus onset. Only the motion energy sensor and the self-attention-based Dual model estimated partial rotation in the expected regions, matching or opposing the predicted directions; this effect was absent in control stimuli. Our results highlight the gap between current DNN-based motion models and human vision. Future models tested in experimental loops should incorporate mechanisms that account for proposed explanations of the Rotating Snakes illusion, such as pupil dilation, eye movements, and contrast-dependent processing latency, as well as color- and contrast-sensitive adaptation.
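The motion energy sensor referred to in the abstract belongs to the classic spatiotemporal-energy family of models (Adelson-Bergen style). As a rough illustration of the kind of computation involved, the sketch below implements a minimal 1-D opponent motion-energy sensor; the function name, filter sizes, and frequency parameters are illustrative choices of ours, not the authors' implementation.

```python
# Minimal 1-D opponent motion-energy sensor (Adelson-Bergen style).
# Stimuli are 2-D arrays indexed as [time, space]. All parameter values
# here are illustrative, not those used in the study.
import numpy as np
from scipy.signal import fftconvolve

def motion_energy(stimulus, sf=0.1, tf=0.1, sigma=3.0):
    """Mean opponent motion energy: positive for rightward drift,
    negative for leftward drift, ~0 for a truly static pattern."""
    t = np.arange(-7, 8)[:, None]   # temporal filter axis
    x = np.arange(-7, 8)[None, :]   # spatial filter axis
    env = np.exp(-(t**2 + x**2) / (2 * sigma**2))  # Gaussian envelope
    # Quadrature pairs tuned to rightward (sf*x - tf*t) and leftward drift.
    r_even = env * np.cos(2 * np.pi * (sf * x - tf * t))
    r_odd  = env * np.sin(2 * np.pi * (sf * x - tf * t))
    l_even = env * np.cos(2 * np.pi * (sf * x + tf * t))
    l_odd  = env * np.sin(2 * np.pi * (sf * x + tf * t))
    def energy(even, odd):
        # Phase-invariant energy: sum of squared quadrature responses.
        return (fftconvolve(stimulus, even, 'valid') ** 2
                + fftconvolve(stimulus, odd, 'valid') ** 2)
    return float(np.mean(energy(r_even, r_odd) - energy(l_even, l_odd)))

# Demo stimuli: a rightward-drifting grating vs. the same grating frozen in time.
T = X = 60
tt, xx = np.meshgrid(np.arange(T), np.arange(X), indexing='ij')
drifting = np.cos(2 * np.pi * (0.1 * xx - 0.1 * tt))       # drifts rightward
static = np.tile(np.cos(2 * np.pi * 0.1 * np.arange(X)), (T, 1))  # no motion

me_drift = motion_energy(drifting)   # positive: net rightward energy
me_static = motion_energy(static)    # near zero: no directional signal
```

The opponent stage (rightward energy minus leftward energy) is what makes the sensor direction-selective: a static input drives both channels equally and cancels, which is why partial responses of such a sensor to static Rotating Snakes patterns, as reported above, are notable.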

Topic Area: Visual Processing & Computational Vision

Extended Abstract: Full Text PDF