Integrating sensory information from multiple modalities leads to more precise and efficient perception and behaviour. The process of determining which sensory information should be perceptually bound relies on low-level stimulus features as well as multisensory associations learned throughout development from the statistics of our environment. Here, we explored the relationship between multisensory associative learning and multisensory integration using electroencephalography (EEG) and behavioural measures. Sixty-one participants completed a three-phase study. First, participants were exposed to novel audiovisual shape-tone pairings with frequent and infrequent stimulus pairings and completed a target detection task. EEG recordings of the mismatch negativity (MMN) and P3 were calculated as neural indices of multisensory associative learning. Next, the same learned stimulus pairs were presented in audiovisual as well as unisensory auditory and visual modalities while both early (<100 ms) and late neural indices of multisensory integration were recorded. Finally, participants completed an analogous behavioural speeded-response task, with behavioural indices of multisensory gain calculated using the Race Model. Significant relationships were found between neural measures of associative learning in fronto-central and occipital areas and both early and late indices of multisensory integration in frontal and centro-parietal areas, respectively. Participants who showed stronger indices of associative learning also exhibited stronger indices of multisensory integration of the stimuli they learned to associate. Furthermore, a significant relationship was found between the neural index of early multisensory integration and behavioural indices of multisensory gain. These results provide insight into the neural underpinnings of how higher-order processes such as associative learning guide multisensory integration.
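The Race Model analysis mentioned above is commonly based on Miller's race model inequality, which states that under parallel unisensory processing P(RT_AV ≤ t) ≤ P(RT_A ≤ t) + P(RT_V ≤ t); positive violations of this bound index multisensory gain. A minimal sketch of this computation is shown below, assuming standard empirical-CDF estimation; the function names and the scalar gain summary (area of positive violation) are illustrative, not the authors' exact pipeline.

```python
import numpy as np

def ecdf(rts, t):
    """Empirical CDF of reaction times, evaluated at each time in t."""
    rts = np.sort(np.asarray(rts, dtype=float))
    return np.searchsorted(rts, t, side="right") / len(rts)

def race_model_violation(rt_av, rt_a, rt_v, n_points=100):
    """Compare the audiovisual RT distribution to the race model bound.

    Race model inequality: P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t).
    Returns the evaluation grid, the violation curve, and a scalar gain
    (area where the audiovisual CDF exceeds the bound). A positive gain
    suggests facilitation beyond statistical summation of the unisensory
    channels, i.e. multisensory integration.
    """
    all_rts = np.concatenate([rt_av, rt_a, rt_v]).astype(float)
    t = np.linspace(all_rts.min(), all_rts.max(), n_points)
    bound = np.minimum(1.0, ecdf(rt_a, t) + ecdf(rt_v, t))
    violation = ecdf(rt_av, t) - bound
    # Integrate only the positive part of the violation curve
    positive = np.clip(violation, 0.0, None)
    gain = float(np.sum(0.5 * (positive[1:] + positive[:-1]) * np.diff(t)))
    return t, violation, gain
```

In practice the comparison is often made at fixed RT percentiles across participants rather than on a shared time grid, but the logic is the same: audiovisual responses faster than the summed unisensory bound cannot be explained by two independent racing processes.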