
Audio-visual integration during overt visual attention

Authors :
Cliodhna Quigley
Selim Onat
Sue Harding
Martin Cooke
Peter König
Source :
Journal of Eye Movement Research, Vol 1, Iss 2 (2008)
Publication Year :
2008
Publisher :
Bern Open Publishing, 2008.

Abstract

How do different sources of information arising from different modalities interact to control where we look? To answer this question under conditions approximating real-world viewing, we presented natural images and spatially localized sounds in (V)isual, (A)uditory, and audio-visual (AV) conditions and measured subjects' eye movements. Our results demonstrate that eye movements in the AV condition are spatially biased towards the part of the image corresponding to the sound source. Interestingly, this spatial bias depends on the probability that a given image region is fixated (its saliency) in the V condition. This indicates that fixation behaviour in the AV condition results from an integration process. Regression analysis shows that this integration is best accounted for by a linear combination of unimodal saliencies.
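To make the reported analysis concrete, the following is a minimal sketch of what fitting audio-visual fixation behaviour as a linear combination of unimodal saliencies could look like. The array names, region grid, and synthetic data are illustrative assumptions, not the authors' actual data or analysis code.

    # Sketch: audio-visual saliency modelled as a linear combination of
    # unimodal (visual and auditory) saliencies, fitted by least-squares
    # regression. All data here are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical per-region fixation probabilities (saliency) for one image,
    # e.g. estimated on a coarse grid of image regions and flattened.
    n_regions = 64
    sal_visual = rng.random(n_regions)    # V condition: visual saliency per region
    sal_auditory = rng.random(n_regions)  # A condition: auditory saliency per region
    # Synthetic AV saliency generated as a noisy mixture, for illustration only.
    sal_av = 0.7 * sal_visual + 0.3 * sal_auditory + 0.05 * rng.random(n_regions)

    # Linear model: AV saliency ~ w_v * V + w_a * A + bias.
    X = np.column_stack([sal_visual, sal_auditory, np.ones(n_regions)])
    coefs, _, _, _ = np.linalg.lstsq(X, sal_av, rcond=None)
    w_v, w_a, bias = coefs
    print(f"visual weight: {w_v:.3f}, auditory weight: {w_a:.3f}, bias: {bias:.3f}")

    # Goodness of fit (R^2) of the linear combination.
    pred = X @ coefs
    ss_res = np.sum((sal_av - pred) ** 2)
    ss_tot = np.sum((sal_av - sal_av.mean()) ** 2)
    print(f"R^2: {1 - ss_res / ss_tot:.3f}")

In this kind of analysis, the fitted weights indicate the relative contribution of each modality to fixation behaviour in the bimodal condition, and the R^2 indicates how well a purely linear combination accounts for the data.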

Details

Language :
English
ISSN :
1995-8692
Volume :
1
Issue :
2
Database :
Directory of Open Access Journals
Journal :
Journal of Eye Movement Research
Publication Type :
Academic Journal
Accession number :
edsdoj.3c1f180802674426b772de993f9627a2
Document Type :
article
Full Text :
https://doi.org/10.16910/jemr.1.2.4