Synchronizing to real events: Subjective audiovisual alignment scales with perceived auditory depth and speed of sound
- Source :
- Proceedings of the National Academy of Sciences. 102:2244-2247
- Publication Year :
- 2005
Abstract
- Because of the slow speed of sound relative to light, acoustic and visual signals from a distant event will often be received asynchronously. Here, using acoustic signals with a robust cue to sound-source distance, we show that judgments of perceived temporal alignment with a visual marker depend on the depth simulated in the acoustic signal. For distant sounds, a large delay of sound relative to vision is required for the signals to be perceived as temporally aligned. For nearer sources, the time lag corresponding to audiovisual alignment is smaller and scales at a rate approximating the speed of sound. Thus, when robust cues to auditory distance are present, the brain can synchronize disparate audiovisual signals to external events despite considerable differences in time of arrival at the perceiver. This ability is functionally important because it allows auditory and visual signals to be synchronized to the external event that caused them.
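The scaling the abstract describes can be illustrated with a back-of-the-envelope calculation (not taken from the paper): the physical lag of sound behind light for a source at distance d is approximately d / c, where c ≈ 343 m/s is an assumed speed of sound in air (light's travel time is negligible at these distances). A minimal sketch:

```python
# Illustrative sketch only: physical sound-vs-light arrival lag as a
# function of source distance. The paper's finding is that the point of
# subjective audiovisual alignment shifts with simulated auditory
# distance at roughly this rate.

SPEED_OF_SOUND_M_S = 343.0  # assumed value; varies with air temperature


def sound_lag_seconds(distance_m: float) -> float:
    """Expected delay of the acoustic signal relative to the visual one."""
    return distance_m / SPEED_OF_SOUND_M_S


if __name__ == "__main__":
    for d in (1.0, 10.0, 34.3):
        print(f"{d:5.1f} m -> {sound_lag_seconds(d) * 1000:6.1f} ms lag")
```

On this assumption, a source about 34 m away arrives roughly 100 ms late relative to its visual signal, which is the order of delay an observer would need to tolerate for the two to be judged simultaneous.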
- Subjects :
- Visual marker, Multidisciplinary, Computer science, Event (computing), Distance Perception, Speech recognition, Synchronizing, Biological Sciences, Signal, Time of arrival, Speed of sound, Time Perception, Auditory Perception, Visual Perception, Humans, Monte Carlo Method, Sound (geography)
Details
- ISSN :
- 1091-6490 and 0027-8424
- Volume :
- 102
- Database :
- OpenAIRE
- Journal :
- Proceedings of the National Academy of Sciences
- Accession number :
- edsair.doi.dedup.....1b3cf554bb86f08dc58c379bb25e3347