Poster Session B: Wednesday, August 13, 1:00 – 4:00 pm, de Brug & E‑Hall
Integrating Non-Classical Receptive Fields of the Primary Visual Cortex into CNNs Enhances Adversarial Robustness
Ehsan Ur Rahman Mohammed1, Elham Bagheri2, Soniya1, Apurva Narayan1, Yalda Mohsenzadeh1; 1University of Western Ontario, 2Vector Institute
Presenter: Yalda Mohsenzadeh
The adversarial robustness of artificial intelligence models remains a critical challenge for their deployment in real-world applications. Brain-inspired approaches have garnered considerable attention in recent years, given the superior adversarial robustness of human vision. In this study, we propose a novel architecture named nCRF-SurroundNet, which applies surround modulation to the responses obtained by passing images through a Gabor filter bank. Our enhancements are guided by neurophysiological findings showing that V1 receptive fields extend beyond the classical receptive field, the region from which a convolutional kernel directly receives input, to include the non-classical receptive field (nCRF). The nCRF plays a crucial role by either inhibiting or amplifying the output of the classical receptive field, depending on the surrounding context. The proposed model is evaluated against standard convolutional architectures, including AlexNet, ResNet50, and the original VOneNet, on two benchmark datasets, CIFAR-10 and ImageNet100. Performance is assessed under adversarial attack, specifically the projected gradient descent (PGD) and Carlini & Wagner (C&W) attack methods. The results on CIFAR-10 and ImageNet100 demonstrate that our proposed models achieve significantly higher adversarial robustness against PGD attacks than existing models, while also maintaining relatively strong performance against C&W attacks. Notably, on ImageNet100, our models surpass all competitors under PGD attacks and under the stronger variant of the C&W attack.
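To make the described mechanism concrete, the sketch below illustrates one plausible reading of a Gabor front end with nCRF-style surround modulation: a fixed Gabor filter bank models classical receptive fields, a larger depthwise convolution pools surround context, and a learned gain lets the surround either suppress or facilitate the center response. This is not the authors' implementation; the module name, kernel sizes, filter count, and modulation rule are all illustrative assumptions.

```python
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F

def gabor_kernel(size, sigma, theta, lam, gamma=0.5, psi=0.0):
    """Build a single 2-D Gabor kernel (hypothetical parameterization)."""
    half = size // 2
    y, x = np.meshgrid(np.arange(-half, half + 1),
                       np.arange(-half, half + 1), indexing="ij")
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(x_t**2 + gamma**2 * y_t**2) / (2 * sigma**2)) \
        * np.cos(2 * np.pi * x_t / lam + psi)
    return torch.tensor(g, dtype=torch.float32)

class GaborSurroundFrontEnd(nn.Module):
    """Illustrative front end: fixed Gabor filter bank (classical RF)
    followed by surround modulation from a larger context kernel (nCRF)."""
    def __init__(self, n_orient=8, ksize=7, surround_ksize=15):
        super().__init__()
        kernels = torch.stack([gabor_kernel(ksize, sigma=2.0, theta=t, lam=4.0)
                               for t in np.linspace(0, np.pi, n_orient, endpoint=False)])
        self.register_buffer("gabor", kernels.unsqueeze(1))  # (n_orient, 1, k, k)
        # Larger depthwise kernel pools context from the surround of each CRF.
        self.surround = nn.Conv2d(n_orient, n_orient, surround_ksize,
                                  padding=surround_ksize // 2,
                                  groups=n_orient, bias=False)
        # Learned per-channel gain decides whether the surround inhibits or amplifies.
        self.gain = nn.Parameter(torch.zeros(1, n_orient, 1, 1))

    def forward(self, x):  # x: (B, 1, H, W) grayscale input
        crf = F.relu(F.conv2d(x, self.gabor, padding=self.gabor.shape[-1] // 2))
        surround = self.surround(crf)
        # Context-dependent modulation: negative gain suppresses, positive gain facilitates.
        return crf * (1.0 + torch.tanh(self.gain) * surround)
```

Under these assumptions, such a module would replace the first convolutional layer of a standard backbone (e.g., AlexNet or ResNet50), with the remaining layers trained as usual.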
Topic Area: Visual Processing & Computational Vision
Extended Abstract: Full Text PDF