Poster Session C: Friday, August 15, 2:00 – 5:00 pm, de Brug & E‑Hall
Decoding Colour Information from EEG Signals in Natural Scenes
Arash Akbarinia, Justus Liebig University Giessen
Presenter: Arash Akbarinia
Recent years have seen major advances in decoding perceptual information from EEG signals, driven by developments in artificial intelligence, in particular large datasets, transformer-based models, and contrastive learning. Here, we investigate whether colour information can be decoded from EEG signals recorded while participants viewed natural scenes. While earlier work shows that colour features (e.g., the hue circle) are decodable under controlled conditions, it is unclear whether such information persists in complex, real-world settings. Using the THINGS-EEG2 dataset, we analyse EEG recordings from a 64-electrode cap as participants viewed natural images for 100 ms each in a rapid serial visual presentation (RSVP) paradigm. To define colour ground truth, we apply the Segment Anything Model (SAM) to segment images into foreground and background, quantifying each segment's colour using common categories from human colour naming studies. An artificial neural network is trained to predict scene colour content from EEG signals alone, and performance is evaluated by comparing predicted and ground-truth colours for each region. Our findings show that EEG signals retain decodable colour information even in object recognition tasks without explicit colour references, offering new insights into the brain's colour representation and opening doors for naturalistic brain-computer interfaces and neuroimaging research.
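The colour ground-truth step described above (quantifying a segment's colour content using common colour-naming categories) can be sketched roughly as follows. This is a minimal illustration, not the authors' pipeline: the category names and RGB prototype values are placeholder assumptions (colour-naming studies typically use calibrated colour spaces such as CIELAB, and SAM would supply the segment mask).

```python
import numpy as np

# Hypothetical RGB prototypes for a few basic colour terms; real
# colour-naming data would provide calibrated category centroids.
PROTOTYPES = {
    "red":    (200, 30, 40),
    "green":  (40, 160, 60),
    "blue":   (40, 70, 200),
    "yellow": (230, 220, 50),
    "white":  (240, 240, 240),
    "black":  (15, 15, 15),
}

def colour_histogram(image, mask):
    """Assign each masked pixel (e.g. a SAM foreground segment) to its
    nearest colour-category prototype and return the normalised
    category histogram for that segment."""
    names = list(PROTOTYPES)
    protos = np.array([PROTOTYPES[n] for n in names], dtype=float)  # (C, 3)
    pixels = image[mask].astype(float)                              # (N, 3)
    # Euclidean distance from every pixel to every prototype
    dists = np.linalg.norm(pixels[:, None, :] - protos[None, :, :], axis=-1)
    counts = np.bincount(dists.argmin(axis=1), minlength=len(names))
    return dict(zip(names, counts / counts.sum()))

# Usage: a uniformly red image with a full mask is labelled entirely "red".
img = np.full((4, 4, 3), (200, 30, 40), dtype=np.uint8)
mask = np.ones((4, 4), dtype=bool)
hist = colour_histogram(img, mask)
```

Such per-segment histograms could then serve as regression or classification targets for a decoder trained on the EEG signals.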
Topic Area: Visual Processing & Computational Vision