Poster Session C: Friday, August 15, 2:00 – 5:00 pm, de Brug & E‑Hall
Auditory Object Formation in Temporally Complex Scenes
Berfin Bastug1, Yue Sun, Erich Schröger, David Poeppel2; 1Universität Leipzig, 2New York University
Presenter: Berfin Bastug
The auditory system decomposes boundaryless sensory input into meaningful units through Auditory Scene Analysis (ASA) (Bregman, 1990). Repetition helps listeners segregate overlapping sounds and identify distinct auditory objects (McDermott et al., 2011). Previous studies suggest that repeated units in noisy or ambiguous contexts can eventually be perceived as stable auditory objects (Barczak et al., 2018; McDermott et al., 2011), though the behavioral dynamics of this process remain unclear. We investigated this build-up process using 'tone cloud' stimuli. By manipulating the repetition strength and duration of the repeated units within the tone clouds, we created auditory analogues of the visual motion coherence paradigm. Participants completed repetition detection and sensorimotor synchronization tasks, allowing us to examine how the accumulation of sensory evidence supports the emergence and stabilization of auditory objects. Results reveal sigmoidal, quasi-categorical performance in both tasks. In detection, performance improves earlier for shorter unit durations. Interestingly, in synchronization, performance converges across durations, showing that once an object emerges, it can be tracked equally well regardless of unit duration. Our results suggest a categorical shift in perception, with stabilization occurring after sufficient repetition.
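The abstract does not include stimulus code, so the sketch below is only a rough illustration of the paradigm it describes: a tone cloud in which a fixed random unit repeats, with a 'coherence' parameter controlling what fraction of tones in each time slot follow the repeating pattern (the auditory analogue of motion coherence). All concrete values (44.1 kHz sample rate, 25 ms slots, a 300–3000 Hz log-spaced frequency grid, four tones per slot) and all function names are assumptions for illustration, not the authors' parameters.

```python
import numpy as np

SR = 44100    # assumed sample rate (Hz); not specified in the abstract
SLOT = 0.025  # assumed 25 ms time slots within the cloud

def pure_tone(freq, dur, sr=SR):
    """Pure tone with 5 ms raised-cosine on/off ramps."""
    t = np.arange(int(dur * sr)) / sr
    y = np.sin(2 * np.pi * freq * t)
    n = int(0.005 * sr)
    ramp = 0.5 * (1 - np.cos(np.pi * np.arange(n) / n))
    y[:n] *= ramp
    y[-n:] *= ramp[::-1]
    return y

def tone_cloud(total_dur, unit_dur, coherence, rng, tones_per_slot=4):
    """
    Tone cloud with an embedded repeating unit (illustrative sketch).

    coherence -- fraction of tones per slot drawn from a fixed random
                 pattern that repeats every `unit_dur` seconds; the
                 remaining tones are drawn afresh on every slot.
    unit_dur  -- duration of the repeated unit (the manipulated factor).
    """
    n_slots = int(total_dur / SLOT)
    slots_per_unit = int(unit_dur / SLOT)
    freqs = np.logspace(np.log10(300), np.log10(3000), 40)  # assumed range
    # Fixed pattern: the frequencies the unit plays in each of its slots.
    unit_pattern = [rng.choice(freqs, tones_per_slot, replace=False)
                    for _ in range(slots_per_unit)]
    n_coh = int(round(coherence * tones_per_slot))
    out = np.zeros(int(total_dur * SR))
    for s in range(n_slots):
        picks = list(rng.choice(freqs, tones_per_slot - n_coh, replace=False))
        picks += list(unit_pattern[s % slots_per_unit][:n_coh])
        i0 = int(s * SLOT * SR)
        for f in picks:
            seg = pure_tone(f, SLOT)
            out[i0:i0 + len(seg)] += seg
    return out / np.max(np.abs(out))

rng = np.random.default_rng(0)
stim = tone_cloud(total_dur=3.0, unit_dur=0.2, coherence=0.75, rng=rng)
```

Sweeping `coherence` (repetition strength) and `unit_dur` in a design like this, and measuring detection or synchronization accuracy at each level, is the kind of procedure that would trace out the sigmoidal, quasi-categorical performance curves the abstract reports.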
Topic Area: Object Recognition & Visual Attention
Extended Abstract: Full Text PDF