
Validation of a virtual auditory space, and its use to investigate how pitch and spatial cues contribute to perceptual segregation of auditory streams

Nima Zargarnezhad, Western University

Abstract

The human auditory system can decompose complex sound mixtures into distinct perceptual auditory objects through a process (or processes) known as Auditory Scene Analysis. Pitch and spatial cues are among the sound attributes known to influence sequential streaming (Plack 2018). In this project, the fidelity of a virtual acoustic space (the Audio Dome) in reproducing precisely located sound sources with a 9th-order ambisonics algorithm was first validated. The estimated horizontal Minimum Audible Angles were homogeneous across the space and aligned with previously reported values (Mills 1958), and low-frequency presentation was found to be robust. The Audio Dome was then used to test van Noorden's (1975) ABA paradigm, with the A and B sources displaced across a continuum of spatial locations and over several pitch differences. A two-dimensional sigmoid function was fitted to this psychophysical space; the model revealed that spatial and pitch cues are both essential to perceptual organization, with pitch cues perhaps being the more influential.
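
As a rough illustration of the kind of model described above, the sketch below fits a simple planar-logit two-dimensional sigmoid, in which the probability of reporting two streams depends jointly on pitch difference and spatial separation. The functional form, variable names, and data values here are assumptions for illustration only and are not taken from the thesis; the actual model, stimuli, and parameterization may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid2d(X, b0, b_pitch, b_space):
    # Assumed planar-logit 2D sigmoid: probability of a "two streams" percept
    # as a joint function of pitch difference and spatial separation.
    pitch_diff, spatial_sep = X
    z = b0 + b_pitch * pitch_diff + b_space * spatial_sep
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative (made-up) per-condition proportions of "two streams" responses.
pitch_diff    = np.array([0, 0, 3, 3, 6, 6, 9, 9], dtype=float)      # semitones
spatial_sep   = np.array([0, 30, 0, 30, 0, 30, 0, 30], dtype=float)  # degrees
p_two_streams = np.array([0.10, 0.20, 0.30, 0.50, 0.60, 0.80, 0.80, 0.95])

# Fit the surface; the relative sizes of b_pitch and b_space (on their
# respective cue scales) hint at how strongly each cue drives segregation.
params, _ = curve_fit(sigmoid2d, (pitch_diff, spatial_sep), p_two_streams,
                      p0=[-2.0, 0.5, 0.05])
print(params)
```

Under this illustrative formulation, comparing the fitted slopes along the pitch and spatial axes is one way to express the abstract's conclusion that both cues matter, with pitch possibly carrying more weight.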