Crossmodal interactions and multisensory integration in the perception of audio-visual motion — A free-field study
- Author
- Claudia Freigang, Kristina Schmiedchen, Rudolf Rübsamen, and Ines Nitsche
- Subjects
- Adult, Male, Female, Humans, Motion Perception, Motion (physics), Discrimination (Psychological), Perception, Auditory System, Attention, Crossmodal, Multisensory Integration, Acoustic Stimulation, Auditory Perception, Visual Perception, Photic Stimulation, Kinetic Depth Effect, Cognitive Psychology, Neurology (clinical), General Neuroscience, Molecular Biology, Developmental Biology, Psychology
- Abstract
Motion perception can be altered by information received through multiple senses. To date, however, the interplay between the visual and auditory modalities in peripheral motion perception has scarcely been described. The present free-field study investigated audio-visual motion perception for different azimuthal trajectories in space. To disentangle effects related to crossmodal interactions (the influence of one modality on signal processing in another) and multisensory integration (the binding of bimodal streams into a unified percept), we manipulated the subjects' attention in two experiments that used a single set of moving audio-visual stimuli. Acoustic and visual signals were either congruent or spatially and temporally disparate at motion offset. (i) Crossmodal interactions were studied in a selective attention task: subjects were instructed to attend to either the acoustic or the visual stream and to indicate the perceived final position of the motion. (ii) Multisensory integration was studied in a divided attention task in which subjects reported whether they perceived unified or separated audio-visual motion offsets. The results indicate that crossmodal interactions in motion perception do not depend on the integration of the audio-visual stream. Furthermore, in the crossmodal task, both visual and auditory motion perception were susceptible to modulation by task-irrelevant streams, provided that temporal disparities did not exceed a critical range. Concurrent visual streams modulated auditory motion perception in the central field, whereas concurrent acoustic streams attracted visual motion information in the periphery. Differences in the ability of the visual and auditory systems to accurately track positional information along different trajectories account for the observed biasing effects.
- Published
- 2012