8 results for "A. John Van Opstal"
Search Results
2. Stable bottom-up processing during dynamic top-down modulations in monkey auditory cortex
- Author
- Huib Versnel, Sigrid M. C. I. Van Wetter, Marc M. Van Wanrooij, Roohollah Massoudi, and A. John Van Opstal
- Subjects
- Auditory perception, Auditory cortex, Receptive field, Neurons, Electrophysiology, Acoustic stimulation, Reaction time, Attention, Macaca mulatta, Male, Animals, Biophysics, Psychology, General Neuroscience
- Abstract
It is unclear whether top-down processing in the auditory cortex (AC) interferes with its bottom-up analysis of sound. Recent studies indicated non-acoustic modulations of AC responses, and that attention changes a neuron's spectrotemporal tuning. As a result, the AC would seem ill-suited to represent a stable acoustic environment, which is deemed crucial for auditory perception. To assess whether top-down signals influence acoustic tuning in tasks without directed attention, we compared monkey single-unit AC responses to dynamic spectrotemporal sounds under different behavioral conditions. Recordings were mostly made from neurons located in primary fields (primary AC and area R of the AC) that were well tuned to pure tones, with short onset latencies. We demonstrated that responses in the AC were substantially modulated during an auditory detection task and that these modulations were systematically related to top-down processes. Importantly, despite these significant modulations, the spectrotemporal receptive fields of all neurons remained remarkably stable. Our results suggest multiplexed encoding of bottom-up acoustic and top-down task-related signals at single AC neurons. This mechanism preserves a stable representation of the acoustic environment despite strong non-acoustic modulations.
- Published
- 2013
3. Absence of compensation for vestibular-evoked passive head rotations in human sound localization
- Author
- Denise C. P. B. M. Van Barneveld, A. John Van Opstal, and Floor Binkhorst
- Subjects
- Sound localization, Vestibular system, Vestibular nystagmus, Eye movement, Gaze, Rotation, Motor system, Acoustics, Audiology, Physics, Loudspeaker, General Neuroscience
- Abstract
A world-fixed sound presented to a moving head produces changing sound-localization cues, from which the audiomotor system could infer sound movement relative to the head. When appropriately combined with self-motion signals, sound localization remains spatially accurate. Indeed, free-field orienting responses fully incorporate intervening eye-head movements under open-loop localization conditions. Here we investigate the default strategy of the audiomotor system when localizing sounds in the absence of efferent and proprioceptive head-movement signals. Head- and body-restrained listeners made saccades in total darkness toward brief (3, 10 or 100 ms) broadband noise bursts, while being rotated sinusoidally (f = 1/9 Hz, Vpeak = 112 deg/s) around the vertical body axis. As the loudspeakers were attached to the chair, the 100 ms sounds might be perceived as rotating along with the chair, and localized in head-centred coordinates. During 3 and 10 ms stimuli, however, the amount of chair rotation remained well below the minimum audible movement angle. These brief sounds would therefore be perceived as stationary in space and, as in open-loop gaze orienting, expected to be localized in world-centred coordinates. Analysis of the saccades shows, however, that all stimuli were accurately localized on the basis of imposed acoustic cues, but remained in head-centred coordinates. These results suggest that, in the absence of motor planning, the audiomotor system keeps sounds in head-centred coordinates when unsure about sound motion relative to the head. To that end, it ignores vestibular canal signals of passive-induced head rotation, but incorporates intervening eye displacements from vestibular nystagmus during the saccade-reaction time.
- Published
- 2011
- Full Text
- View/download PDF
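The abstract's argument rests on how little the chair rotates during the brief noise bursts. A back-of-the-envelope check of this claim, assuming pure sinusoidal motion and evaluating the worst-case sweep centred on the velocity peak (the formula and any threshold comparison are illustrative, not taken from the paper):

```python
import math

F = 1 / 9        # rotation frequency (Hz), from the abstract
V_PEAK = 112.0   # peak chair velocity (deg/s), from the abstract

def max_rotation_during(stimulus_s, f=F, v_peak=V_PEAK):
    """Worst-case chair rotation (deg) accumulated during a stimulus.

    For sinusoidal motion theta(t) = A*sin(2*pi*f*t) with amplitude
    A = v_peak / (2*pi*f), the largest displacement over a window of
    length T occurs around the velocity peak: delta = 2*A*sin(pi*f*T).
    """
    amplitude = v_peak / (2 * math.pi * f)
    return 2 * amplitude * math.sin(math.pi * f * stimulus_s)

for t_ms in (3, 10, 100):
    print(f"{t_ms:>3} ms burst: <= {max_rotation_during(t_ms / 1000):.2f} deg")
```

The 3 and 10 ms bursts sweep roughly 0.3 and 1.1 deg, while the 100 ms burst sweeps about 11 deg, consistent with the abstract's reasoning that only the long stimulus could plausibly be heard as moving with the chair.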
4. Acquired prior knowledge modulates audiovisual integration
- Author
- Marc M. Van Wanrooij, A. John Van Opstal, and Peter Bremen
- Subjects
- Multisensory integration, Prior probability, Spatial analysis, Event (probability theory), Communication, General Neuroscience
- Abstract
Orienting responses to audiovisual events in the environment can benefit markedly by the integration of visual and auditory spatial information. However, logically, audiovisual integration would only be considered successful for stimuli that are spatially and temporally aligned, as these would be emitted by a single object in space-time. As humans do not have prior knowledge about whether novel auditory and visual events do indeed emanate from the same object, such information needs to be extracted from a variety of sources. For example, expectation about alignment or misalignment could modulate the strength of multisensory integration. If evidence from previous trials repeatedly favours aligned audiovisual inputs, the internal state might also assume alignment for the next trial, and hence react to a new audiovisual event as if it were aligned. To test for such a strategy, subjects oriented a head-fixed pointer as fast as possible to a visual flash that was consistently paired, though not always spatially aligned, with a co-occurring broadband sound. We varied the probability of audiovisual alignment between experiments. Reaction times were consistently lower in blocks containing only aligned audiovisual stimuli than in blocks also containing pseudorandomly presented spatially disparate stimuli. Results demonstrate dynamic updating of the subject's prior expectation of audiovisual congruency. We discuss a model of prior probability estimation to explain the results.
- Published
- 2010
- Full Text
- View/download PDF
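The abstract above mentions a model of prior-probability estimation without specifying it here. As a purely illustrative sketch (an assumption, not the authors' model), a Beta-Bernoulli estimator captures the kind of trial-by-trial updating of alignment expectation described:

```python
class AlignmentPrior:
    """Running estimate of P(audiovisual stimuli are spatially aligned).

    Beta-Bernoulli update: start from a weakly informative Beta(1, 1)
    prior and accumulate aligned / misaligned trials as pseudo-counts.
    """

    def __init__(self, a=1.0, b=1.0):
        self.a = a  # pseudo-count of aligned trials
        self.b = b  # pseudo-count of misaligned trials

    def update(self, aligned: bool):
        if aligned:
            self.a += 1
        else:
            self.b += 1

    @property
    def p_aligned(self):
        # Posterior mean of the Beta(a, b) distribution
        return self.a / (self.a + self.b)

prior = AlignmentPrior()
for trial_aligned in [True, True, True, False, True]:
    prior.update(trial_aligned)
print(f"estimated P(aligned) = {prior.p_aligned:.2f}")  # (1+4) / (2+5) = 5/7
```

Under such a scheme, blocks containing only aligned stimuli drive the estimate toward 1, which would predict the stronger integration (shorter reaction times) reported in the abstract.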
5. Eye position determines audiovestibular integration during whole-body rotation
- Author
- Denise C. P. B. M. Van Barneveld and A. John Van Opstal
- Subjects
- Sound localization, Vestibular system, Vestibulo-ocular reflex, Illusion, Rotation, Median plane, Sagittal plane, Orientation, Acoustic stimulation, Head movements, Fixation (visual), Visual perception, Ocular physiological phenomena, Photic stimulation, Binaural recording, Acoustics, Biophysics, Adult, Middle Aged, Male, Female, Humans, Psychology, General Neuroscience
- Abstract
When a sound is presented in the free field at a location that remains fixed to the head during whole-body rotation in darkness, it is heard displaced in the direction opposing the rotation. This phenomenon is known as the audiogyral illusion. Consequently, the subjective auditory median plane (AMP; the plane where the binaural difference cues for sound localization are perceived to be zero) shifts in the direction of body rotation. Recent experiments, however, have suggested opposite AMP results when using a fixation light that also moves with the head. Although in this condition the eyes remain stationary in the head, an ocular pursuit signal cancels the vestibulo-ocular reflex, which could induce an additional AMP shift. We tested whether the AMP is influenced by vestibular signals, eye position or eye velocity. We rotated subjects sinusoidally at different velocities, either in darkness or with a head-fixed fixation light, while they judged the laterality (left vs. right with respect to the midsagittal plane of the head) of broadband sounds presented over headphones. Subjects also performed the same task without vestibular stimulation while tracking a sinusoidally moving visual target, which mimicked the average eye-movement patterns of the vestibular experiments in darkness. Results show that whole-body rotation in darkness induces a shift of the AMP in the direction of body rotation. In contrast, we obtained no significant AMP change when a fixation light was used. The pursuit experiments showed a shift of the AMP in the direction of eccentric eye position but not at peak pursuit velocity. We therefore conclude that the vestibular-induced shift in average eye position underlies both the audiogyral illusion and the AMP shift.
- Published
- 2010
6. Involvement of the inferior colliculus in a binaural spatial illusion (Commentary on Rajala et al.)
- Author
- John van Opstal
- Subjects
- Inferior colliculus, Illusions, Sound localization, Binaural recording, Cognitive science, Cognitive psychology, Male, Female, Animals, Humans, Biophysics, Psychology, General Neuroscience
- Abstract
Illusions are effective tools for the study of the neural mechanisms underlying perception because neural responses can be correlated to the physical properties of stimuli and the subject's perceptions. The Franssen illusion (FI) is an auditory spatial illusion evoked by presenting a transient, abrupt tone and a slowly rising, sustained tone of the same frequency simultaneously on opposite sides of the subject. Perception of the FI consists of hearing a single sound, the sustained tone, on the side that the transient was presented. Both subcortical and cortical mechanisms for the FI have been proposed, but, to date, there is no direct evidence for either. The data show that humans and rhesus monkeys perceive the FI similarly. Recordings were taken from single units of the inferior colliculus in monkeys while they indicated the perceived location of sound sources with their gaze. The results show that the transient component of the Franssen stimulus, with a shorter first spike latency and higher discharge rate than the sustained tone, encodes the perception of sound location. Furthermore, the persistent erroneous perception of the sustained stimulus location is due to continued excitation of the same neurons, first activated by the transient, by the sustained stimulus without location information. These results demonstrate for the first time, on a trial-by-trial basis, a correlation between perception of an auditory spatial illusion and a subcortical physiological substrate.
- Published
- 2013
7. Acquired prior knowledge modulates audiovisual integration
- Author
- Marc M. Van Wanrooij, Peter Bremen, and A. John Van Opstal
- Published
- 2010
- Full Text
- View/download PDF
8. Human sound‐localization behaviour after multiple changes in eye position
- Author
- Tom J. Van Grootel and A. John Van Opstal
- Published
- 2009
- Full Text
- View/download PDF