174 results for "Frank Bremmer"
Search Results
2. Integration of landmark and saccade target signals in macaque frontal cortex visual responses
- Author
- Adrian Schütz, Vishal Bharmauria, Xiaogang Yan, Hongying Wang, Frank Bremmer, and J. Douglas Crawford
- Subjects
- Biology (General), QH301-705.5
- Abstract
Visual landmarks influence spatial cognition and behavior, but their influence on visual codes for action is poorly understood. Here, we test landmark influence on the visual response to saccade targets recorded from 312 frontal and 256 supplementary eye field neurons in rhesus macaques. Visual response fields are characterized by recording neural responses to various target-landmark combinations, and then we test against several candidate spatial models. Overall, frontal/supplementary eye field response fields preferentially code either saccade targets (40%/40%) or landmarks (30%/4.5%) in gaze fixation-centered coordinates, but most cells show multiplexed target-landmark coding within intermediate reference frames (between fixation-centered and landmark-centered). Further, these coding schemes interact: neurons with near-equal target and landmark coding show the biggest shift from fixation-centered toward landmark-centered target coding. These data show that landmark information is preserved and influences target coding in prefrontal visual responses, likely to stabilize movement goals in the presence of noisy egocentric signals.
- Published
- 2023
- Full Text
- View/download PDF
3. Neural correlates of visual and tactile path integration and their task related modulation
- Author
- Lisa Rosenblum, Alexander Kreß, B. Ezgi Arikan, Benjamin Straube, and Frank Bremmer
- Subjects
- Medicine, Science
- Abstract
Self-motion induces sensory signals that allow one to determine travel distance (path integration). For veridical path integration, one must distinguish self-generated from externally induced sensory signals. Predictive coding has been suggested to attenuate self-induced sensory responses, while task relevance can reverse the attenuating effect of prediction. But how is self-motion processing affected by prediction and task demands, and do effects generalize across senses? In this fMRI study, we investigated visual and tactile self-motion processing and its modulation by task demands. Visual stimuli simulated forward self-motion across a ground plane. Tactile self-motion stimuli were delivered by airflow across the subjects’ forehead. In one task, subjects replicated a previously observed distance (Reproduction/Active; high behavioral demand) of passive self-displacement (Reproduction/Passive). In a second task, subjects travelled a self-chosen distance (Self/Active; low behavioral demand), which was recorded and played back to them (Self/Passive). For both tasks and sensory modalities, Active as compared to Passive trials showed enhancement in early visual areas and suppression in higher-order areas of the inferior parietal lobule (IPL). Contrasting high- and low-demanding active trials yielded supramodal enhancement in the anterior insula. The suppression in the IPL suggests that this area acts as a comparator of sensory self-motion signals and predictions thereof.
- Published
- 2023
- Full Text
- View/download PDF
4. Spatial localization during open-loop smooth pursuit
- Author
- Stefan Dowiasch, Marius Blanke, Jonas Knöll, and Frank Bremmer
- Subjects
- smooth eye movements, smooth pursuit, localization, open-loop eye movement, open-loop SPEM, localization error, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571
- Abstract
Introduction: Numerous previous studies have shown that eye movements induce errors in the localization of briefly flashed stimuli. Remarkably, the error pattern is indicative of the underlying eye movement and the exact experimental condition. For smooth pursuit eye movements (SPEM) and the slow phase of the optokinetic nystagmus (OKN), perceived stimulus locations are shifted in the direction of the ongoing eye movement, with a hemifield asymmetry observed only during SPEM. During the slow phases of the optokinetic afternystagmus (OKAN), however, the error pattern can be described as a perceptual expansion of space. Unlike SPEM and OKN, the OKAN is an open-loop eye movement.
Methods: Visually guided smooth pursuit can be transformed into an open-loop eye movement by briefly blanking the pursuit target (gap). Here, we examined flash localization during open-loop pursuit and asked whether localization is also prone to errors and whether these are similar to those found during SPEM or during OKAN. Human subjects tracked a pursuit target. In half of the trials, the target was extinguished for 300 ms (gap) during the steady state, inducing open-loop pursuit. Flashes were presented during this gap or during steady-state (closed-loop) pursuit.
Results: In both conditions, perceived flash locations were shifted in the direction of the eye movement. The overall error pattern was very similar, with error sizes being slightly smaller in the gap condition. The differences between errors in the open- and closed-loop conditions were largest in the central visual field and smallest in the periphery.
Discussion: We discuss the findings in light of the neural substrates driving the different forms of eye movements.
- Published
- 2023
- Full Text
- View/download PDF
5. High (130 Hz)- and mid (60 Hz)-frequency deep brain stimulation in the subthalamic nucleus differentially modulate response inhibition: A preliminary combined EEG and eye tracking study
- Author
- Josefine Waldthaler, Alexander Sperlich, Aylin König, Charlotte Stüssel, Frank Bremmer, Lars Timmermann, and David Pedrosa
- Subjects
- Antisaccade, Response inhibition, Parkinson’s disease, Executive functions, Eye tracking, Movement disorders, Computer applications to medicine. Medical informatics, R858-859.7, Neurology. Diseases of the nervous system, RC346-429
- Abstract
While deep brain stimulation (DBS) in the subthalamic nucleus (STN) improves motor functions in Parkinson’s disease (PD), it may also increase impulsivity by interfering with the inhibition of reflexive responses. The aim of this study was to investigate if varying the pulse frequency of STN-DBS has a modulating effect on response inhibition and its neural correlates.
For this purpose, 14 persons with PD repeated an antisaccade task in three stimulation settings (DBS off, high-frequency DBS (130 Hz), mid-frequency DBS (60 Hz)) in a randomized order, while eye movements and brain activity via high-density EEG were recorded.
On a behavioral level, 130 Hz DBS stimulation had no effect on response inhibition measured as antisaccade error rate, while 60 Hz DBS induced a slight but significant reduction of directional errors compared with the DBS-off state and 130 Hz DBS. Further, stimulation with both frequencies decreased the onset latency of correct antisaccades, while increasing the latency of directional errors.
Time-frequency domain analysis of the EEG data revealed that 60 Hz DBS was associated with an increase in preparatory theta power over a midfrontal region of interest compared with the DBS-off state, which is generally regarded as a marker of increased cognitive control. While no significant differences in brain activity over mid- and lateral prefrontal regions of interest emerged between the 60 Hz and 130 Hz conditions, both stimulation frequencies were associated with a stronger midfrontal beta desynchronization during the mental preparation for correct antisaccades compared with the DBS-off state, which is discussed in the context of potentially enhanced proactive recruitment of the oculomotor network.
Our preliminary findings suggest that mid-frequency STN-DBS may provide beneficial effects on response inhibition, while both 130 Hz and 60 Hz STN-DBS may promote voluntary actions at the expense of slower reflexive responses.
- Published
- 2023
- Full Text
- View/download PDF
6. Visual Perturbation Suggests Increased Effort to Maintain Balance in Early Stages of Parkinson’s to be an Effect of Age Rather Than Disease
- Author
- Justus Student, David Engel, Lars Timmermann, Frank Bremmer, and Josefine Waldthaler
- Subjects
- Parkinson’s disease, body sway, virtual reality, center of mass (CoM), center of pressure (CoP), Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571
- Abstract
Postural instability marks a prevalent symptom of Parkinson’s disease (PD). It often manifests in increased body sway, which is commonly assessed by tracking the Center of Pressure (CoP). Yet, in terms of postural control, it is the body’s Center of Mass (CoM), not the CoP, that is regulated in a gravitational field. The aim of this study was to explore the effect of early- to mid-stage PD on these measures of postural control in response to unpredictable visual perturbations. We investigated three cohorts: (i) 18 patients with early- to mid-stage PD [Hoehn & Yahr stage (1–3); 1.94 ± 0.70]; (ii) a group of 15 age-matched controls (ECT); and (iii) a group of 12 young healthy adults (YCT). Participants stood on a force plate to track their CoP, while the movement of their entire body was recorded with a video-based motion-tracking system to monitor their CoM. A moving-room paradigm was applied through a head-mounted virtual reality headset. The stimulus consisted of a virtual tunnel that stretched in the anterior-posterior direction and either remained static or moved back and forth in an unpredictable fashion.
We found differences in mean sway amplitude (MSA) and mean velocities of CoP and CoM between the groups under both conditions, with higher MSA of CoP and CoM for PD and higher mean velocities of both variables for PD and ECT when compared with YCT. Visual perturbation increased mean CoP velocity in all groups but did not have effects on mean CoM velocity or MSA. While significantly lower for the young adults, the net effect of visual perturbation on mean CoP velocity was similar between patients with PD and age-matched controls. There was no effect of the visual perturbation on mean CoM velocity for any of the groups.
Our simultaneous assessment of CoP and CoM revealed that postural control is reflected differently in CoM and CoP. As the motion of the CoM remained mostly unaffected, all groups successfully counteracted the perturbation and maintained their balance. The higher CoP velocity for PD and ECT revealed the increased corrective motion needed to achieve this, which, however, was similar in both groups. Thus, our results suggest increased effort, expressed in CoP velocity, to be an effect of age rather than disease in earlier stages of PD.
- Published
- 2022
- Full Text
- View/download PDF
7. Preattentive and Predictive Processing of Visual Motion
- Author
- Constanze Schmitt, Steffen Klingenhoefer, and Frank Bremmer
- Subjects
Medicine ,Science - Abstract
Interaction with the environment requires fast and reliable sensory processing. The visual system is confronted with a continuous flow of high-dimensional input (e.g. orientation, color, motion). From a theoretical point of view, it would be advantageous if critical information was processed independent of attentional load, i.e. preattentively. Here, we hypothesized that visual motion is such a critical signal and aimed for a neural signature of its preattentive encoding. Furthermore, we were interested in the neural correlates of predictability of linear motion trajectories based on the presence or absence of preceding motion. We presented a visual oddball paradigm and studied event-related potentials (ERPs). Stimuli were linearly moving Gabor patches that disappeared behind an occluder. The difference between deviant and standard trials was a trajectory change which happened behind the occluder in deviant trials only, inducing a prediction error. As hypothesized, we found a visual mismatch negativity component over parietal and occipital electrodes. In a further condition, trials without preceding motion were presented in which the patch just appeared from behind the occluder and, hence, was not predictable. We found larger ERP components for unpredictable stimuli. In summary, our results provide evidence for preattentive and predictive processing of linear trajectories of visual motion.
- Published
- 2018
- Full Text
- View/download PDF
8. Heading representations in primates are compressed by saccades
- Author
- Frank Bremmer, Jan Churan, and Markus Lappe
- Subjects
- Science
- Abstract
Macaque higher visual areas MST and VIP encode heading direction based on self-motion stimuli. Here the authors show that, while making saccades, the heading direction decoded from the neural responses is compressed toward straight-ahead, and independently demonstrate a perceptual illusion in humans based on this perisaccadic decoding error.
- Published
- 2017
- Full Text
- View/download PDF
9. Comparison of the precision of smooth pursuit in humans and head unrestrained monkeys
- Author
- Jan Churan, Doris I. Braun, Karl R. Gegenfurtner, and Frank Bremmer
- Subjects
- Eye movement, eye tracking, saccades, smooth pursuit, non-human primates, head unrestrained, Human anatomy, QM1-695
- Abstract
Direct comparison of results of humans and monkeys is often complicated by differences in experimental conditions. We replicated, in head-unrestrained macaques, the experiments of a recent study comparing human directional precision during smooth pursuit eye movements (SPEM) and saccades to moving targets (Braun & Gegenfurtner, 2016). Directional precision of human SPEM follows an exponential decay function, reaching optimal values of 1.5°-3° within 300 ms after target motion onset, whereas the precision of initial saccades to moving targets is slightly better. As in humans, we found general agreement in the development of directional precision of SPEM over time and in the differences between the directional precision of initial saccades and SPEM initiation. However, monkeys showed overall lower precision in SPEM compared to humans. This was most likely due to differences in experimental conditions, such as head stabilization, which was provided by a chin and head rest for human subjects, while the monkeys' heads were unrestrained.
- Published
- 2018
- Full Text
- View/download PDF
10. Decoding target distance and saccade amplitude from population activity in the macaque lateral intraparietal area (LIP)
- Author
- Frank Bremmer, Andre Kaminiarz, Steffen Klingenhoefer, and Jan Churan
- Subjects
- monkey, Decoding, saccade, area LIP, smooth pursuit, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571, Neurology. Diseases of the nervous system, RC346-429
- Abstract
Primates perform saccadic eye movements in order to bring the image of an interesting target onto the fovea. Compared to stationary targets, saccades towards moving targets are computationally more demanding, since the oculomotor system must use speed and direction information about the target as well as knowledge about its own processing latency to program an adequate, predictive saccade vector. In monkeys, different brain regions have been implicated in the control of voluntary saccades, among them the lateral intraparietal area (LIP). Here we asked if activity in area LIP reflects the distance between fovea and saccade target, the amplitude of an upcoming saccade, or both. We recorded single-unit activity in area LIP of two macaque monkeys. First, we determined for each neuron its preferred saccade direction. Then, monkeys performed visually guided saccades along the preferred direction towards either stationary or moving targets in pseudo-randomized order. LIP population activity allowed us to decode both the distance between fovea and saccade target and the size of an upcoming saccade. Previous work has shown comparable results for saccade direction (Graf and Andersen, 2014a, b). Hence, LIP population activity allows the prediction of any two-dimensional saccade vector. Functional equivalents of macaque area LIP have been identified in humans. Accordingly, our results provide further support for the concept of activity from area LIP as a neural basis for the control of an oculomotor brain-machine interface.
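The kind of population decoding this abstract describes can be illustrated with a toy simulation. Everything below is a hypothetical sketch (the Gaussian amplitude tuning, the ridge-regularized linear readout, and all numerical values are illustrative assumptions), not the study's actual recordings or analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "LIP-like" population: 25 units with Gaussian tuning to
# saccade amplitude (deg); every parameter here is illustrative.
centers = np.linspace(2.0, 20.0, 25)
width = 2.0

def population_response(amplitude, noise=0.05):
    """Noisy firing rates of the simulated population for one saccade."""
    rates = np.exp(-0.5 * ((amplitude - centers) / width) ** 2)
    return rates + noise * rng.standard_normal(centers.size)

# Ridge-regularized linear readout trained on simulated trials.
train_amps = rng.uniform(2.0, 20.0, 500)
X = np.array([population_response(a) for a in train_amps])
X = np.column_stack([X, np.ones(len(X))])   # bias column
w = np.linalg.solve(X.T @ X + 1.0 * np.eye(X.shape[1]), X.T @ train_amps)

def decode(rates):
    """Predict saccade amplitude (deg) from one population response."""
    return np.append(rates, 1.0) @ w

print(decode(population_response(11.0)))    # close to 11 (deg)
```

A linear readout is only one of several decoders applicable to such population activity; the cited work by Graf and Andersen used different, probabilistic methods.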
- Published
- 2016
- Full Text
- View/download PDF
11. Saccadic compression of symbolic numerical magnitude.
- Author
- Paola Binda, M Concetta Morrone, and Frank Bremmer
- Subjects
- Medicine, Science
- Abstract
Stimuli flashed briefly around the time of saccadic eye movements are subject to complex distortions: compression of space and time; underestimation of numerosity. Here we show that saccadic distortions extend to abstract quantities, affecting the representation of symbolic numerical magnitude. Subjects consistently underestimated the results of rapidly computed mental additions and subtractions when the operands were briefly displayed before a saccade. However, the recognition of the number symbols was unimpaired. These results are consistent with the hypothesis of a common, abstract metric encoding magnitude along multiple dimensions. They suggest that a surprising link exists between the preparation of action and the representation of abstract quantities.
- Published
- 2012
- Full Text
- View/download PDF
12. Visuo-tactile heading perception
- Author
- Frank Bremmer, Lisa Rosenblum, Jakob Schwenk, and Alexander Kreß
- Subjects
- Touch Perception, Touch, Physiology, General Neuroscience, Motion Perception, Visual Perception, Humans, Optic Flow, Vestibule, Labyrinth, Photic Stimulation
- Abstract
Self-motion through an environment induces various sensory signals, i.e., visual, vestibular, auditory, or tactile. Numerous studies have investigated the role of visual and vestibular stimulation for the perception of self-motion direction (heading). Here, we investigated the rarely considered interaction of visual and tactile stimuli in heading perception. Participants were presented with optic flow simulating forward self-motion across a horizontal ground plane (visual), airflow toward the participants' forehead (tactile), or both. In separate blocks of trials, participants indicated perceived heading from unimodal visual or tactile or bimodal sensory signals. In bimodal trials, presented headings were either spatially congruent or incongruent, with a maximum offset of 30° between the visual and tactile headings. To investigate the reference frame in which visuo-tactile heading is encoded, we varied head and eye orientation during presentation of the stimuli. Visual and tactile stimuli were designed to achieve comparable precision of heading reports between modalities. Nevertheless, in bimodal trials heading perception was dominated by the visual stimulus. A change of head orientation had no significant effect on perceived heading, whereas, surprisingly, a change in eye orientation affected tactile heading perception. Overall, we conclude that tactile flow is more important to heading perception than previously thought.
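For context, the visual dominance reported here deviates from the standard reliability-weighted (maximum-likelihood) model of cue combination, which predicts that cues matched in precision contribute equally. A minimal sketch of that benchmark model, with purely illustrative numbers rather than the study's data:

```python
import numpy as np

def combine_cues(mu_v, sigma_v, mu_t, sigma_t):
    """Reliability-weighted (maximum-likelihood) fusion of two independent
    Gaussian heading estimates: visual (v) and tactile (t)."""
    w_v = sigma_t ** 2 / (sigma_v ** 2 + sigma_t ** 2)    # visual weight
    mu = w_v * mu_v + (1.0 - w_v) * mu_t                  # fused heading
    sigma = np.sqrt(sigma_v ** 2 * sigma_t ** 2 / (sigma_v ** 2 + sigma_t ** 2))
    return mu, sigma

# Incongruent bimodal trial: visual heading 0 deg, tactile heading 30 deg,
# with equal (illustrative) reliabilities.
mu, sigma = combine_cues(0.0, 5.0, 30.0, 5.0)
print(mu, sigma)   # fused heading halfway between the cues, at 15 deg
```

Under this model, equally reliable cues offset by 30° would yield a percept near 15°; the visual dominance the study reports means the observed weighting departed from these equal-reliability predictions.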
- Published
- 2022
13. The visual representation of space in the primate brain
- Author
- Stefan Dowiasch, Andre Kaminiarz, and Frank Bremmer
- Subjects
- Neurology, Neurology (clinical)
- Abstract
One of the major functions of our brain is to process spatial information and to make this information available to our motor systems to interact successfully with the environment. Numerous studies over the past decades and even centuries have investigated how our central nervous system deals with this challenge. Spatial information can be derived from vision. We see where the cup of coffee stands on the breakfast table or where the un-mute button of our video-conference tool is. However, this is always just a snapshot, because the location of the projection of the cup or the un-mute button shifts across the retina with each eye movement, i.e., 2–3 times per second. So, where exactly in space are objects located? And what signals guide self-motion and navigation through our environment? While other sensory signals (vestibular, tactile, auditory, even smell) can also help us localize objects in space and guide our navigation, here we will focus on the dominant sense in primates: vision. We will review (i) how visual information is processed to eventually result in space perception, (ii) how this perception is modulated by action, especially eye movements, at the behavioral and at the neural level, and (iii) how spatial representations relate to other encodings of magnitude, i.e., time and number.
- Published
- 2022
14. Convolutional neural network reveals frequency content of medio-lateral COM body sway to be highly predictive of Parkinson's disease
- Author
- David Engel, Reinhard Stefan Greulich, Alberto Parola, Kaleb Vinehout, Stefan Dowiasch, Josefine Waldthaler, Lars Timmermann, Constantin A. Rothkopf, and Frank Bremmer
- Abstract
Postural instability as a symptom of progressing Parkinson's disease (PD) greatly reduces quality of life. Hence, early detection of postural impairments is crucial to facilitate interventions. Our aim was to use a convolutional neural network (CNN) to differentiate people with early to mid-stage PD from healthy age-matched individuals based on spectrogram images obtained from their body movement. We hypothesized the time-frequency content of body sway to be predictive of PD, even when impairments are not yet manifested in day-to-day postural control. We tracked participants' center of pressure (COP) using a Wii Balance Board and their full-body motion using a Microsoft Kinect, from which we calculated the trajectory of their center of mass (COM). We used 30-s snippets of motion data from which we acquired wavelet-based time-frequency spectrograms that were fed into a custom-built CNN as labeled images. We used binary classification to have the network differentiate between individuals with PD and controls (n = 15 per group). Classification performance was best when the medio-lateral motion of the COM was considered. Here, our network reached an average predictive accuracy of 98.45%, with a receiver operating characteristic area under the curve of 1.0. Moreover, an explainable-AI approach revealed high frequencies in the postural sway data to be most distinct between both groups. Our findings suggest a CNN classifier based on cost-effective and conveniently obtainable posturographic data to be a promising approach to detect postural impairments in early to mid-stage PD and to gain novel insight into the subtle characteristics of impairments at this stage of the disease.
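The feature-extraction step described above (wavelet-based time-frequency spectrograms of body sway) can be sketched as follows. This is a simplified, hypothetical illustration using a synthetic sway signal and a basic Morlet wavelet transform; it is not the authors' pipeline, and the CNN classification stage is omitted:

```python
import numpy as np

fs = 50.0                          # sampling rate (Hz), hypothetical
n_samples = int(30 * fs)           # one 30-s snippet, as in the study
t = np.arange(n_samples) / fs

# Synthetic medio-lateral sway: slow 0.5-Hz component plus a weaker
# 4-Hz component standing in for higher-frequency sway content.
sway = np.sin(2 * np.pi * 0.5 * t) + 0.4 * np.sin(2 * np.pi * 4.0 * t)

def morlet_spectrogram(x, freqs, fs, n_cycles=6):
    """|Morlet wavelet transform|: rows = frequencies, columns = time."""
    out = np.empty((len(freqs), len(x)))
    for i, f in enumerate(freqs):
        dur = n_cycles / f                          # wavelet length (s)
        wt = np.arange(-dur / 2, dur / 2, 1 / fs)   # wavelet time axis
        env = np.exp(-(wt ** 2) / (2 * (dur / 6) ** 2))  # Gaussian envelope
        wavelet = env * np.exp(2j * np.pi * f * wt)
        wavelet /= np.abs(wavelet).sum()            # amplitude normalization
        out[i] = np.abs(np.convolve(x, wavelet, mode="same"))
    return out

freqs = np.arange(0.5, 8.5, 0.5)   # 0.5 ... 8.0 Hz in 0.5-Hz steps
spec = morlet_spectrogram(sway, freqs, fs)
# The frequency bin with the highest average power across the snippet:
print(freqs[spec.mean(axis=1).argmax()])
```

In the study, such spectrograms were saved as labeled images and fed to the CNN; the sketch only verifies that the transform localizes the dominant sway frequency.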
- Published
- 2023
15. Models of vision need some action
- Author
- Constantin A. Rothkopf, Frank Bremmer, Katja Fiehler, Katharina Dobs, and Jochen Triesch
- Abstract
Bowers et al. focus their criticisms on research that compares behavioral and brain data from the ventral stream with a class of deep neural networks for object recognition. While they are right to identify issues with current benchmarking research programs, they overlook a much more fundamental limitation of this literature: disregarding the importance of action and interaction for perception.
- Published
- 2023
16. Visual perturbation of balance suggests impaired motor control but intact visuomotor processing in Parkinson’s disease
- Author
- David W. Engel, Frank Bremmer, Adam P. Morris, Justus Student, Lars Timmermann, Josefine Waldthaler, and Jakob C. B. Schwenk
- Subjects
- Adult, Male, Parkinson's disease, Physiology, Disease, Phase locking, Postural control, Young Adult, Humans, Medicine, Postural Balance, Aged, Balance (ability), General Neuroscience, Motor control, Parkinson Disease, Middle Aged, Body sway, Visual Perception, Female, Neuroscience, Psychomotor Performance
- Abstract
Postural instability marks one of the most disabling features of Parkinson's disease (PD), but it only reveals itself after affected brain areas have already been significantly damaged. Thus, there is a need to detect deviations in balance and postural control before visible symptoms occur. In this study, we visually perturbed balance in the anterior-posterior direction using sinusoidal oscillations of a moving room in virtual reality at different frequencies. We tested three groups: individuals with PD under dopaminergic medication, an age-matched control group, and a group of young healthy adults. We tracked their center of pressure and their full-body motion, from which we also extracted the center of mass. We investigated sway amplitudes and applied newly introduced phase-locking analyses to investigate responses across participants' bodies. Patients exhibited significantly higher sway amplitudes as compared with the control subjects. However, their sway was phase locked to the visual motion like that of age-matched and young healthy adults. Furthermore, all groups successfully compensated for the visual perturbation by phase locking their sway to the stimulus. As the frequency of the perturbation increased, the distribution of phase locking (PL) across the body revealed a shift of the highest PL values from the upper body toward the hip region for young healthy adults, which could not be observed in patients and elderly healthy adults. Our findings suggest impaired motor control but intact visuomotor processing in early stages of PD, while reduced flexibility in adapting postural strategy to different perturbations appears to be an effect of age rather than disease.
- Published
- 2021
17. Multi-segment phase coupling to oscillatory visual drive
- Author
- Frank Bremmer, Adam P. Morris, Jakob C. B. Schwenk, David W. Engel, and Adrian Schütz
- Subjects
- Adult, Male, Visual perception, Motion Perception, Biophysics, Phase (waves), Stimulus (physiology), Motion (physics), Young Adult, Humans, Orthopedics and Sports Medicine, Postural Balance, Balance (ability), Rehabilitation, Virtual Reality, Torso, Adaptation, Physiological, Healthy Volunteers, Female, Percept
- Abstract
Background: It has been shown that humans adapt their postural sway to oscillatory, visually simulated self-motion. However, little is known about how individual body segments contribute to this adjustment of body sway and how this contribution varies with different environmental conditions.
Research question: How do the centre of pressure (COP) and individual body segments phase-lock to a sinusoidal visual drive depending on the frequency of stimulation?
Methods: In this study, we introduce phase coupling as a method for assessing full-body motion in response to visual stimuli presented in virtual reality (VR). 12 participants (mean age: 31 ± 9 years) stood inside a virtual tunnel which oscillated sinusoidally in the anterior-posterior direction at a frequency of 0.2 Hz, 0.8 Hz or 1.2 Hz. Primary outcome measures were the trajectories of their COP as well as of 25 body segments obtained by a motion-tracking system.
Results: Subjects significantly coupled the phase of their COP and body segments to the visual drive. Our analysis yielded significant phase coupling of the COP to the stimulus for all tested frequencies. The phase coupling of body segments revealed a shift in postural response as a function of frequency. At the low frequency of 0.2 Hz, we found strong and significant phase coupling homogeneously distributed across the body. At the higher frequencies of 0.8 Hz and 1.2 Hz, however, overall phase coupling became weaker and was centred around the lower torso and hip segments.
Significance: Information on how the visual percept of self-motion affects balance control is crucial for understanding visuomotor processing in health and disease. Our setup and methods constitute a reliable tool for assessing perturbed balance control, which can be utilized in future clinical trials.
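Phase coupling of the kind described here is commonly quantified with a phase-locking value (PLV) computed from instantaneous phases. Below is a minimal, self-contained sketch with synthetic signals; the 0.2-Hz drive frequency is taken from the abstract, while everything else (signal durations, the FFT-based analytic-signal construction, the comparison signals) is an illustrative assumption, not the authors' exact analysis:

```python
import numpy as np

fs = 100.0
t = np.arange(6000) / fs           # one 60-s trial at 100 Hz, hypothetical

def analytic(x):
    """Analytic signal via FFT (like scipy.signal.hilbert); even-length input."""
    X = np.fft.fft(x)
    h = np.zeros(len(x))
    h[0] = 1.0
    h[1:len(x) // 2] = 2.0         # double the positive frequencies
    h[len(x) // 2] = 1.0           # keep the Nyquist bin
    return np.fft.ifft(X * h)      # negative frequencies are zeroed

def plv(x, y):
    """Phase-locking value: modulus of the mean unit phasor of the phase difference."""
    dphi = np.angle(analytic(x)) - np.angle(analytic(y))
    return np.abs(np.exp(1j * dphi).mean())

drive = np.sin(2 * np.pi * 0.2 * t)              # 0.2-Hz visual drive
sway_locked = np.sin(2 * np.pi * 0.2 * t - 0.8)  # lagged but phase-locked sway
sway_unlocked = np.sin(2 * np.pi * 0.83 * t)     # unrelated rhythm

print(plv(drive, sway_locked), plv(drive, sway_unlocked))
```

A PLV near 1 indicates that the sway maintains a fixed phase relation to the drive, as the participants' COP did at the tested frequencies; values near 0 indicate no consistent coupling.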
- Published
- 2021
18. High- and Low-Frequency Deep Brain Stimulation in the Subthalamic Nucleus differentially modulate Response Inhibition and Action Selection in Parkinson’s Disease
- Author
- Josefine Waldthaler, Alexander Sperlich, Aylin König, Charlotte Stüssel, Frank Bremmer, Lars Timmermann, and David Pedrosa
- Abstract
Background: While deep brain stimulation (DBS) in the subthalamic nucleus (STN) improves motor functions in Parkinson’s disease (PD), it has also been associated with increased impulsivity.
Methods: A combined approach of eye-tracking and high-density EEG was used to investigate how high- and low-frequency DBS impact impulsive actions in the antisaccade task in a cohort of ten persons with PD. Computational modelling of the behavioral outcomes allowed a nuanced insight into the effect of DBS on response inhibition and action selection processes.
Results: Against our expectations, both 130 Hz- and 60 Hz-DBS improved response inhibition, as both resulted in a reduced rate of early reflexive errors. Correspondingly, DBS with both frequencies led to increased desynchronization of beta power during the preparatory period, which may be a correlate of anticipatory activation in the oculomotor network. Low-frequency DBS was additionally associated with increased midfrontal theta power, an established marker of cognitive control. While higher midfrontal theta power predicted longer antisaccade latencies in the off-DBS state on a trial-by-trial basis, 130 Hz-DBS reversed this relationship. As informed by the computational model, 130 Hz-DBS further led to a shift in the speed-accuracy trade-off, causing an acceleration and error-proneness of actions later in the trial.
Conclusions: Our results disentangle the impact of DBS on early and late impulsive actions. Only 130 Hz-DBS may disrupt theta-mediated cognitive control mechanisms via medial frontal–STN pathways that are involved in delaying action selection. 60 Hz-DBS may provide beneficial effects on response inhibition without the detrimental effect on action selection seen with 130 Hz-DBS.
Funding: This study was supported by the SUCCESS program of Philipps-University Marburg (JW), the Hessian Ministry of Sciences and the Arts (cluster project: The Adaptive Mind – TAM; FB/AK), and the German Research Foundation (DFG), International Research Training Group 1901 (FB/AK).
- Published
- 2022
19. Perisaccadic encoding of temporal information in macaque area V4
- Author
- Frank Bremmer, Stefan Dowiasch, Jakob C. B. Schwenk, Björn-Olaf Werner, and Steffen Klingenhoefer
- Subjects
- Male, Physiology, General Neuroscience, Eye movement, Biology, Multi-unit activity, Macaca mulatta, Macaque, Saccadic masking, Encoding (memory), Time Perception, Reaction Time, Saccades, Visual Perception, Animals, Macaca, Neuroscience, Temporal information, Photic Stimulation, Visual Cortex
- Abstract
The accurate processing of temporal information is of critical importance in everyday life. Yet, psychophysical studies in humans have shown that the perception of time is distorted around saccadic eye movements. The neural correlates of this misperception are still poorly understood. Behavioral and neural evidence suggest that it is tightly linked to other known perisaccadic modulations of visual perception. To further our understanding of how temporal processing is affected by saccades, we studied the representations of brief visual time intervals during fixation and saccades in area V4 of two awake macaques. We presented random sequences of vertical bar stimuli and extracted neural responses to double-pulse stimulation at varying interstimulus intervals. Our results show that temporal information about intervals as brief as 20 ms is reliably represented in the multiunit activity in area V4. Response latencies were not systematically modulated by the saccade. However, a general increase in perisaccadic activity altered the ratio of response amplitudes within stimulus pairs compared with fixation. In line with previous studies showing that the perception of brief time intervals is partly based on response levels, this may be seen as a possible correlate of the perisaccadic misperception of time.
- Published
- 2021
20. Neural responses to broadband visual flicker in marmoset primary visual cortex
- Author
-
Jakob C. B. Schwenk, Maureen A. Hagan, Shaun L. Cloherty, Elizabeth Zavitz, Adam P. Morris, Nicholas S. C. Price, Marcello G. P. Rosa, and Frank Bremmer
- Subjects
genetic structures - Abstract
Temporal information is ubiquitous in natural vision and must be represented accurately in the brain to allow us to interact with a constantly changing world. Recent studies have employed a random stimulation paradigm to map the temporal response function (TRF) to luminance changes in the human EEG. This approach has revealed that the visual system, when presented with broadband visual input, actively selects distinct temporal frequencies and retains their phase information for prolonged periods of time. This non-linear response likely originates in primary visual cortex (V1), yet it has so far not been investigated at the neural level. Here, we characterize the steady-state response to random broadband visual flicker in marmoset V1. In two experiments, we recorded from (i) marmosets passively stimulated under general anesthesia, and (ii) awake marmosets under free viewing conditions. Our results show that local field potential (LFP) coupling to the stimulus was broadband and unselective under anesthesia, whereas in awake animals it was restricted to two distinct frequency components, in the alpha and beta range. Within these frequency bands, coupling adhered to the receptive field (RF) boundaries of the local populations. The responses outside the RF did not provide evidence for a propagation of stimulus information across the cortex, contrary to results in human EEG studies. This result may be explained by short fixation durations, warranting further investigation. In summary, our findings show that during awake behavior V1 neural responses to broadband information are selective for distinct frequency bands, and that this selectivity is likely controlled actively by top-down mechanisms.
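The TRF mapping described above can be illustrated with a reverse-correlation sketch: when the input is a white-noise luminance sequence, cross-correlating stimulus and response at positive lags recovers the system's impulse response. The damped 10 Hz oscillation used as the kernel below is a hypothetical stand-in for an alpha-band "echo", not the study's measured response; all names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1000                              # sampling rate (Hz)
stim = rng.standard_normal(20 * fs)    # broadband (white-noise) luminance sequence

# Simulated response: stimulus filtered by a damped 10 Hz oscillation
# (a stand-in "echo" kernel), plus measurement noise.
t = np.arange(0, 0.5, 1 / fs)
kernel = np.exp(-t / 0.15) * np.sin(2 * np.pi * 10 * t)
resp = np.convolve(stim, kernel)[: len(stim)] + rng.standard_normal(len(stim))

# Reverse correlation: for white-noise input, the stimulus-response
# cross-correlation at lag l is an unbiased estimate of kernel[l].
trf = np.array(
    [np.dot(stim[: len(stim) - l], resp[l:]) for l in range(len(t))]
) / len(stim)
```

With 20 s of data the estimated TRF closely matches the generating kernel; real EEG/LFP analyses additionally normalize by the stimulus autocorrelation when the input is not white.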
- Published
- 2022
21. Inter-trial phase coherence of visually evoked postural responses in virtual reality
- Author
-
David W. Engel, Milosz Krala, Adrian Schütz, Jakob C. B. Schwenk, Frank Bremmer, and Adam P. Morris
- Subjects
Adult ,medicine.medical_specialty ,genetic structures ,Headset ,Motion Perception ,Postural response ,Stimulus (physiology) ,Audiology ,Virtual reality ,Body sway ,050105 experimental psychology ,03 medical and health sciences ,0302 clinical medicine ,Center of pressure (terrestrial locomotion) ,medicine ,Humans ,0501 psychology and cognitive sciences ,Force platform ,Postural Balance ,Balance (ability) ,COP ,Phase coherence ,General Neuroscience ,05 social sciences ,Virtual Reality ,Frequency ,Trial Phase ,Psychology ,030217 neurology & neurosurgery ,Psychomotor Performance ,Research Article - Abstract
Vision plays a central role in maintaining balance. When humans perceive their body as moving, they trigger counter movements. This results in body sway, which has typically been investigated by measuring the body’s center of pressure (COP). Here, we aimed to induce visually evoked postural responses (VEPR) by simulating self-motion in virtual reality (VR) using a sinusoidally oscillating “moving room” paradigm. Ten healthy subjects participated in the experiment. Stimulation consisted of a 3D-cloud of random dots, presented through a VR headset, which oscillated sinusoidally in the anterior–posterior direction at different frequencies. We used a force platform to measure subjects’ COP over time and quantified the resulting trajectory by wavelet analyses including inter-trial phase coherence (ITPC). Subjects exhibited significant coupling of their COP to the respective stimulus. Even when spectral analysis of postural sway showed only small responses in the expected frequency bands (power), ITPC revealed an almost constant strength of coupling to the stimulus within but also across subjects and presented frequencies. Remarkably, ITPC even revealed a strong phase coupling to stimulation at 1.5 Hz, which exceeds the frequency range that has generally been attributed to the coupling of human postural sway to an oscillatory visual scenery. These findings suggest phase-locking to be an essential feature of visuomotor control.
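The ITPC measure used above has a compact definition: the magnitude of the trial-averaged unit phase vector at each frequency and time point, ranging from 0 (random phases) to 1 (perfect phase-locking to the stimulus). A minimal sketch, assuming phase angles have already been extracted with a wavelet transform (the arrays here are synthetic):

```python
import numpy as np

def itpc(phases):
    """Inter-trial phase coherence for phase angles (radians) of shape
    (n_trials, n_times): magnitude of the mean unit phase vector
    across trials, in [0, 1]."""
    return np.abs(np.exp(1j * phases).mean(axis=0))

rng = np.random.default_rng(0)
locked = np.full((50, 4), 0.3)                   # identical phase on every trial
random = rng.uniform(-np.pi, np.pi, (2000, 4))   # no phase-locking
print(itpc(locked))   # -> [1. 1. 1. 1.]
print(itpc(random))   # near 0 for large trial counts
```

Because ITPC discards amplitude, it can reveal consistent stimulus coupling even when spectral power in the driven frequency band is small, which is the property the study exploits.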
- Published
- 2020
22. Influence of Tactile Flow on Visual Heading Perception
- Author
-
Lisa Rosenblum, Elisa Grewe, Jan Churan, and Frank Bremmer
- Subjects
Ophthalmology ,Cognitive Neuroscience ,Experimental and Cognitive Psychology ,Computer Vision and Pattern Recognition ,Sensory Systems - Abstract
The integration of information from different sensory modalities is crucial for successful navigation through an environment. Among others, self-motion induces distinct optic flow patterns on the retina, vestibular signals and tactile flow, which contribute to determining traveled distance (path integration) or movement direction (heading). While the processing of combined visual–vestibular information is subject to a growing body of literature, the processing of visuo-tactile signals in the context of self-motion has received comparatively little attention. Here, we investigated whether visual heading perception is influenced by behaviorally irrelevant tactile flow. In the visual modality, we simulated an observer’s self-motion across a horizontal ground plane (optic flow). Tactile self-motion stimuli were delivered by air flow from head-mounted nozzles (tactile flow). In blocks of trials, we presented only visual or tactile stimuli and subjects had to report their perceived heading. In another block of trials, tactile and visual stimuli were presented simultaneously, with the tactile flow within ±40° of the visual heading (bimodal condition). Here, importantly, participants had to report their perceived visual heading. Perceived self-motion direction in all conditions revealed a centripetal bias, i.e., heading directions were perceived as compressed toward straight ahead. In the bimodal condition, we found a small but systematic influence of task-irrelevant tactile flow on visually perceived headings as a function of their directional offset. We conclude that tactile flow is more tightly linked to self-motion perception than previously thought.
- Published
- 2021
23. Spatial Coding of Visual Targets in the Frontal and Supplementary Eye Fields
- Author
-
Vishal Bharmauria, Adrian Schütz, Xiaogang Yan, Hongying Wang, Frank Bremmer, and John Douglas Crawford
- Subjects
Ophthalmology ,Sensory Systems - Published
- 2022
24. Neural Signatures of Actively Controlled Self-Motion and the Subjective Encoding of Distance
- Author
-
Constanze Schmitt, Milosz Krala, and Frank Bremmer
- Subjects
General Neuroscience ,General Medicine - Abstract
Navigating through an environment requires knowledge about one’s direction of self-motion (heading) and traveled distance. Behavioral studies showed that human participants can actively reproduce a previously observed travel distance purely based on visual information. Here, we employed electroencephalography (EEG) to investigate the underlying neural processes. We measured, in human observers, event-related potentials (ERPs) during visually simulated straight-forward self-motion across a ground plane. The participants’ task was to reproduce (active condition) double the distance of a previously seen self-displacement (passive condition) using a gamepad. We recorded the trajectories of self-motion during the active condition and played them back to the participants in a third set of trials (replay condition). We analyzed EEG activity separately for four electrode clusters: frontal (F), central (C), parietal (P), and occipital (O). When aligned to self-motion onset or offset, response modulation of the ERPs was stronger, and several ERP components had different latencies in the passive as compared with the active condition. This result is in line with the concept of predictive coding, which implies modified neural activation for self-induced versus externally induced sensory stimulation. We also aligned our data to the times when subjects passed the (objective) single distance d_obj and the (subjective) single distance d_sub. Remarkably, wavelet-based temporal-frequency analyses revealed enhanced theta-band activation for the F, P, and O clusters shortly before passing d_sub. This enhanced activation could be indicative of a navigation-related representation of subjective distance. More generally, our study design makes it possible to investigate subjective perception without confounding neural activation caused by the required response action.
- Published
- 2022
25. The encoding of saccadic eye movements within human posterior parietal cortex.
- Author
-
Christina S. Konen, Raimund Kleiser, Hans-Jörg Wittsack, Frank Bremmer, and Rüdiger J. Seitz
- Published
- 2004
- Full Text
- View/download PDF
26. Deconstructing the receptive field: Information coding in macaque area MST.
- Author
-
Bart Krekelberg, Monica Paolini, Frank Bremmer, Markus Lappe, and Klaus-Peter Hoffmann
- Published
- 2001
- Full Text
- View/download PDF
27. Space Coding in Primate Posterior Parietal Cortex.
- Author
-
Frank Bremmer, Anja Schlack, Jean-René Duhamel, Werner Graf, and Gereon R. Fink
- Published
- 2001
- Full Text
- View/download PDF
28. Visual perturbation of balance suggests impaired neuromuscular stability but intact visuo-motor control in Parkinson’s disease
- Author
-
David W. Engel, Frank Bremmer, Jakob C. B. Schwenk, Adam P. Morris, Justus Student, Lars Timmermann, and Josefine Waldthaler
- Subjects
medicine.medical_specialty ,Parkinson's disease ,Flexibility (anatomy) ,business.industry ,Dopaminergic ,Postural instability ,Motor control ,Stimulus (physiology) ,medicine.disease ,Control subjects ,medicine.anatomical_structure ,Physical medicine and rehabilitation ,medicine ,business ,Balance (ability) - Abstract
Postural instability marks one of the most disabling features of Parkinson’s disease (PD), but only reveals itself after affected brain areas have already been significantly damaged. Thus, there is a need to detect deviations in balance and postural control before visible symptoms occur. In this study, we visually perturbed balance in the anterior-posterior direction using sinusoidal oscillations of a moving room in virtual reality at different frequencies. We tested three groups: individuals with PD under dopaminergic medication, an age-matched control group, and a group of young healthy adults. We tracked their centre of pressure and their full-body motion. We investigated sway amplitudes and applied newly introduced phase-locking analyses to investigate responses across participants’ bodies. Patients exhibited significantly higher sway amplitudes as compared to the control subjects. However, their sway was phase-locked to the visual motion like that of age-matched and young healthy adults. Furthermore, all groups successfully compensated for the visual perturbation by phase-locking their sway to the stimulus, most likely reflexively. As the frequency of the perturbation increased, the distribution of phase-locking (PL) across the body revealed a shift of the highest PL values from the upper body towards the hip region for young healthy adults, which could not be observed in patients and elderly healthy adults. Our findings suggest impaired neuromuscular stability but intact visuomotor processing in early stages of PD, while reduced flexibility in adapting postural strategy to different perturbations proved to be an effect of age rather than disease.
New & Noteworthy: A better understanding of visuomotor control in Parkinson’s disease (PD) potentially serves as a tool for earlier diagnosis, which is crucial for improving patients’ quality of life. In our study, we assess body sway responses to visual perturbations of the balance control system in patients with early-to-mid stage PD, using motion tracking along with recently established phase-locking techniques. Our findings suggest that patients at this stage have impaired muscular stability but intact visuomotor control.
- Published
- 2021
29. Coding of interceptive saccades in parietal cortex of macaque monkeys
- Author
-
Frank Bremmer, Jan Churan, Andre Kaminiarz, and Jakob C. B. Schwenk
- Subjects
Superior Colliculi ,Histology ,Computer science ,Posterior parietal cortex ,Macaque ,Parietal cortex ,Parietal Lobe ,biology.animal ,Saccades ,Animals ,biology ,General Neuroscience ,Superior colliculus ,Haplorhini ,Gaze ,Saccadic masking ,Electrophysiology ,Saccade ,Macaca ,Original Article ,Anatomy ,Neuroscience ,Photic Stimulation ,Visual motion ,Coding (social sciences) - Abstract
The oculomotor system can initiate remarkably accurate saccades towards moving targets (interceptive saccades), the processing of which is still under debate. The generation of these saccades requires the oculomotor centers to have information about the motion parameters of the target, which then must be extrapolated to bridge the inherent processing delays. We investigated to what degree information about the motion of a saccade target is available in the lateral intraparietal area (area LIP) of macaque monkeys for the generation of accurate interceptive saccades. When a multi-layer neural network was trained on neural discharges from area LIP around the time of saccades towards stationary targets, it was also able to predict the end points of saccades directed towards moving targets. This prediction, however, lagged behind the actual post-saccadic position of the moving target by ~ 80 ms when the whole neuronal sample of 105 neurons was used. We further found that single neurons differentially code for the motion of the target. Selecting neurons with the strongest representation of target motion reduced this lag to ~ 30 ms, which represents the position of the moving target approximately at the onset of the interceptive saccade. We conclude that, similarly to recent findings from the superior colliculus (Goffart et al., J Neurophysiol 118(5):2890–2901), there is a continuum of contributions of individual LIP neurons to the accuracy of interceptive saccades. A contribution of other gaze control centers (like the cerebellum or the frontal eye field) that further increases saccadic accuracy is, however, likely. Supplementary Information The online version contains supplementary material available at 10.1007/s00429-021-02365-x.
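The decoding approach can be sketched with a toy model: a small two-layer network trained by gradient descent to map trial-wise population firing rates onto 2-D saccade end points. Everything here (number of neurons, tuning model, network size, learning rate) is a hypothetical stand-in, not the authors' actual network or data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for trial-wise LIP population activity: 300 trials
# of 40 neurons, each tuned (with noise) to a 2-D saccade end point (deg).
endpoints = rng.uniform(-20, 20, size=(300, 2))
tuning = rng.normal(size=(2, 40))
rates = np.tanh(endpoints @ tuning * 0.05) + 0.1 * rng.normal(size=(300, 40))

# Minimal two-layer network (tanh hidden layer, linear output) trained
# by batch gradient descent to map rates -> normalized end points.
y = endpoints / 20.0                          # keep targets ~O(1) for stability
W1 = 0.1 * rng.normal(size=(40, 16)); b1 = np.zeros(16)
W2 = 0.1 * rng.normal(size=(16, 2)); b2 = np.zeros(2)
lr, n = 0.1, len(rates)
for _ in range(3000):
    h = np.tanh(rates @ W1 + b1)              # hidden layer
    pred = h @ W2 + b2                        # predicted (normalized) end points
    err = pred - y
    dh = (err @ W2.T) * (1 - h ** 2)          # backprop through tanh
    W2 -= lr * h.T @ err / n;  b2 -= lr * err.mean(0)
    W1 -= lr * rates.T @ dh / n;  b1 -= lr * dh.mean(0)

rmse_deg = 20 * np.sqrt(np.mean((pred - y) ** 2))
print(f"training RMSE: {rmse_deg:.2f} deg")
```

A decoder of this kind, trained only on saccades to stationary targets, is what the study then probed with moving-target trials to measure how far the predicted end point lags the target.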
- Published
- 2021
30. Gaze-Related Activity in Primate Frontal Cortex Predicts and Mitigates Spatial Uncertainty
- Author
-
Xiaogang Yan, Frank Bremmer, Vishal Bharmauria, John Douglas Crawford, P. A. Khoozani, Adrian Schütz, and Hongying Wang
- Subjects
Frontal cortex ,Landmark ,biology ,Feature (computer vision) ,biology.animal ,Primate ,Psychology ,Prefrontal cortex ,Sensory cue ,Gaze ,Neuroscience ,Mirroring - Abstract
A remarkable feature of primate behavior is the ability to predict future events based on past experience and current sensory cues. To understand how the brain plans movements in the presence of unstable cues, we recorded gaze-related activity in the frontal cortex of two monkeys engaged in a quasi-predictable cue-conflict task. Animals were trained to look toward remembered visual targets in the presence of a landmark that shifted with fixed amplitude but randomized direction. As simulated by a probabilistic model based on known physiology/behavior, gaze end points assumed a circular distribution around the target, mirroring the possible directions of the landmark shift. This predictive strategy was reflected in frontal cortex activity (especially supplementary eye fields), which anticipated future gaze distributions before the actual landmark shift. In general, these results implicate prefrontal cortex in the predictive integration of environmental cues and their learned statistical properties to mitigate spatial uncertainty.
- Published
- 2021
31. Quantitative comparison of a mobile and a stationary video-based eye-tracker
- Author
-
Frank Bremmer, Peter Wolf, and Stefan Dowiasch
- Subjects
Eye Movements ,Computer science ,Eye movement analysis ,Experimental and Cognitive Psychology ,Sample (statistics) ,050105 experimental psychology ,Article ,Visual processing ,03 medical and health sciences ,0302 clinical medicine ,Arts and Humanities (miscellaneous) ,Developmental and Educational Psychology ,Psychophysics ,Quantitative comparative analysis ,Humans ,0501 psychology and cognitive sciences ,Computer vision ,Smooth-pursuit eye movement (SPEM) ,General Psychology ,Vision, Ocular ,Mobile eye-tracking ,business.industry ,05 social sciences ,Eye movement ,Gaze ,Pursuit, Smooth ,Face (geometry) ,Saccade ,Fixation eye movement ,Eye tracking ,Psychology (miscellaneous) ,Artificial intelligence ,Eye-tracking ,business ,Accuracy evaluation methods ,030217 neurology & neurosurgery ,Saccade eye movements - Abstract
Vision represents the most important sense of primates. To understand visual processing, various methods are employed, for example electrophysiology, psychophysics, or eye-tracking. For the latter method, researchers have recently begun to step outside the artificial environments of laboratory setups toward the more natural conditions we usually face in the real world. To get a better understanding of the advantages and limitations of modern mobile eye-trackers, we quantitatively compared one of the most advanced mobile eye-trackers available, the EyeSeeCam (ESC), with a commonly used laboratory eye-tracker, the EyeLink II, serving as a gold standard. We aimed to investigate whether or not fully mobile eye-trackers are capable of providing data that would be adequate for direct comparisons with data recorded by stationary eye-trackers. Therefore, we recorded three commonly used eye movements (fixations, saccades, and smooth-pursuit eye movements) with both eye-trackers in successive standardized paradigms in a laboratory setting with eight human subjects. Despite major technical differences between the devices, most eye movement parameters were not statistically different between the two systems. Differences could only be found in overall gaze accuracy and for time-critical parameters such as saccade duration, for which a higher sample frequency is especially useful. Although the stationary EyeLink II system proved to be superior, especially on a single-subject or even a single-trial basis, the ESC showed similar performance for the averaged parameters across both trials and subjects. We concluded that modern mobile eye-trackers are well-suited to providing reliable oculomotor data at the required spatial and temporal resolutions.
- Published
- 2019
32. Comparison of the precision of smooth pursuit in humans and head unrestrained monkeys
- Author
-
Jan, Churan, Doris I, Braun, Karl R, Gegenfurtner, and Frank, Bremmer
- Subjects
Eye movement ,smooth pursuit ,non-human primates ,eye tracking ,saccades ,head unrestrained ,Research Article - Abstract
Direct comparison of results between humans and monkeys is often complicated by differences in experimental conditions. We replicated in head-unrestrained macaques the experiments of a recent study comparing human directional precision during smooth pursuit eye movements (SPEM) and saccades to moving targets (Braun & Gegenfurtner, 2016). Directional precision of human SPEM follows an exponential decay function reaching optimal values of 1.5°-3° within 300 ms after target motion onset, whereas the precision of initial saccades to moving targets is slightly better. As in humans, we found general agreement in the development of directional precision of SPEM over time and in the differences between the directional precision of initial saccades and SPEM initiation. However, monkeys showed overall lower precision in SPEM compared to humans. This was most likely due to differences in experimental conditions, such as in the stabilization of the head, which was achieved by a chin and head rest in human subjects but was unrestrained in monkeys.
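The decay of directional imprecision described above can be written as sigma(t) = sigma_inf + (sigma_0 - sigma_inf) * exp(-t / tau). A minimal sketch of this model with illustrative parameter values (not the fitted values from either study):

```python
import numpy as np

def precision(t_ms, sigma0=8.0, sigma_inf=2.0, tau=80.0):
    """Exponential-decay model of directional pursuit precision (deg):
    starts at sigma0 at target-motion onset and settles toward
    sigma_inf. All parameter values here are illustrative only."""
    return sigma_inf + (sigma0 - sigma_inf) * np.exp(-t_ms / tau)

# With these parameters, precision approaches its asymptote within
# ~300 ms, mirroring the time course reported for human SPEM.
for t in (0, 100, 200, 300):
    print(f"{t:3d} ms: {float(precision(t)):.2f} deg")
```

In practice the three parameters would be fitted per observer (e.g. by non-linear least squares) to the time-resolved direction-error data.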
- Published
- 2021
33. Landmark-Centered Coding in Frontal Cortex Visual Responses
- Author
-
Frank Bremmer, Vishal Bharmauria, Adrian Schütz, Hongying Wang, Xiaogang Yan, and J. Douglas Crawford
- Subjects
Landmark ,Frontal cortex ,Computer science ,Visual space ,Spatial cognition ,Stimulus (physiology) ,Frontal eye fields ,Neuroscience ,Gaze ,Coding (social sciences) - Abstract
Summary: Visual landmarks influence spatial cognition [1–3], navigation [4,5] and goal-directed behavior [6–8], but their influence on visual coding in sensorimotor systems is poorly understood [6,9–11]. We hypothesized that visual responses in frontal cortex gaze control areas encode potential targets in an intermediate gaze-centered / landmark-centered reference frame that might depend on specific target-landmark configurations rather than a global mechanism. We tested this hypothesis by recording neural activity in the frontal eye fields (FEF) and supplementary eye fields (SEF) while head-unrestrained macaques engaged in a memory-delay gaze task. Visual response fields (the area of visual space where targets modulate activity) were tested for each neuron in the presence of a background landmark placed at one of four oblique configurations relative to the target stimulus. 102 of 312 FEF and 43 of 256 SEF neurons showed spatially tuned response fields in this task. We then fit these data against a mathematical continuum between a gaze-centered model and a landmark-centered model. When we pooled data across the entire dataset for each neuron, our response field fits did not deviate significantly from the gaze-centered model. However, when we fit response fields separately for each target-landmark configuration, the best fits shifted (mean 37% / 40%) toward landmark-centered coding in FEF / SEF, respectively. This confirmed an intermediate gaze / landmark-centered mechanism dependent on local (configuration-dependent) interactions. Overall, these data show that external landmarks influence prefrontal visual responses, likely helping to stabilize gaze goals in the presence of variable eye and head orientations.
Highlights: Prefrontal visual responses were recorded in the presence of visual landmarks; response fields showed an intermediate gaze / landmark-centered organization; this influence depended on specific target-landmark configurations.
- Published
- 2020
34. Dynamics of Visual Perceptual Echoes Following Short-Term Visual Deprivation
- Author
-
Jakob C. B. Schwenk, Frank Bremmer, and Rufin VanRullen
- Subjects
0301 basic medicine ,Visual perception ,alpha ,genetic structures ,media_common.quotation_subject ,rhythmic sampling ,Electroencephalography ,Stimulus (physiology) ,Visual processing ,03 medical and health sciences ,0302 clinical medicine ,Rhythm ,medicine ,Contrast (vision) ,EEG ,ComputingMilieux_MISCELLANEOUS ,General Environmental Science ,media_common ,Monocular ,medicine.diagnostic_test ,[SCCO.NEUR]Cognitive science/Neuroscience ,eye diseases ,short-term visual deprivation ,030104 developmental biology ,Visual cortex ,medicine.anatomical_structure ,General Earth and Planetary Sciences ,Original Article ,perceptual echoes ,Psychology ,Neuroscience ,030217 neurology & neurosurgery - Abstract
The visual impulse-response function to random input as measured by EEG is dominated by the perceptual echo, a reverberation of stimulus information in the alpha range believed to represent active rhythmic sampling. How this response is generated on a cortical level is unknown. To characterize the underlying mechanisms, we investigated the echoes’ dynamics following short-term visual deprivation, which is known to modify the excitation/inhibition balance in visual cortex. We subjected observers to 150 min of light deprivation (LD) and monocular contrast deprivation (MD). Perceptual echoes were measured by binocular and dichoptic stimulation, respectively, and compared with a baseline condition. Our results show that the echo response is enhanced after LD, but not affected in temporal frequency or spatial propagation. Consistent with previous studies, MD shifted early response (0–150 ms) amplitudes in favor of the deprived eye, but had no systematic effect on the echoes. Our findings demonstrate that the echoes’ synchrony scales with cortical excitability, adding to previous evidence that they represent active visual processing. Their insensitivity to modulation at the monocular level suggests they are generated by a larger region of visual cortex. Our study provides further insight into how mechanisms of rhythmic sampling are implemented in the visual system.
- Published
- 2020
35. Nonretinocentric localization of successively presented flashes during smooth pursuit eye movements
- Author
-
Stefan, Dowiasch, Sonia, Meyer-Stender, Steffen, Klingenhoefer, and Frank, Bremmer
- Subjects
Adult ,Male ,Motion Perception ,Fixation, Ocular ,localization error ,perception ,Pursuit, Smooth ,Retina ,Article ,reference frame ,successive flashes ,Judgment ,Young Adult ,smooth pursuit ,Visual Perception ,Humans ,Female ,Photic Stimulation - Abstract
Keeping track of objects in our environment across body and eye movements is essential for perceptual stability and localization of external objects. As of yet, it is largely unknown how this perceptual stability is achieved. A common behavioral approach to investigate potential neuronal mechanisms underlying spatial vision has been the presentation of one brief visual stimulus across eye movements. Here, we adopted this approach and aimed to determine the reference frame of the perceptual localization of two successively presented flashes during fixation and smooth pursuit eye movements (SPEMs). To this end, eccentric flashes with a stimulus onset asynchrony of zero or ± 200 ms had to be localized with respect to each other during fixation and SPEMs. The results were used to evaluate different models predicting the reference frame in which the spatial information is represented. First, we were able to reproduce the well-known effect of relative mislocalization during fixation. Second, smooth pursuit led to a characteristic relative mislocalization, different from that during fixation. A model assuming that relative localization takes place in a nonretinocentric reference frame described our data best. This suggests that the relative localization judgment is performed at a stage of visual processing in which retinal and nonretinal information is available.
- Published
- 2020
36. A Causal Role of Area hMST for Self-Motion Perception in Humans
- Author
-
Frank Bremmer, Bianca R. Baltaretu, Constanze Schmitt, and J. Douglas Crawford
- Subjects
0301 basic medicine ,medicine.medical_specialty ,Heading (navigation) ,genetic structures ,medicine.medical_treatment ,media_common.quotation_subject ,Audiology ,Stimulus (physiology) ,Macaque ,03 medical and health sciences ,0302 clinical medicine ,biology.animal ,Perception ,transcranial magnetic stimulation ,medicine ,General Environmental Science ,media_common ,heading ,self-motion ,biology ,medicine.diagnostic_test ,Self motion perception ,Medial superior temporal area ,visually guided navigation ,Transcranial magnetic stimulation ,030104 developmental biology ,General Earth and Planetary Sciences ,Original Article ,medial-superior-temporal area ,Functional magnetic resonance imaging ,Psychology ,030217 neurology & neurosurgery - Abstract
Previous studies in the macaque monkey have provided clear causal evidence for an involvement of the medial-superior-temporal area (MST) in the perception of self-motion. These studies also revealed an overrepresentation of contraversive heading. Human imaging studies have identified a functional equivalent (hMST) of macaque area MST. Yet, causal evidence for a role of hMST in heading perception is lacking. We employed neuronavigated transcranial magnetic stimulation (TMS) to test for such a causal relationship. We expected TMS over hMST to induce increased perceptual variance (i.e., impaired precision), while leaving mean heading perception (accuracy) unaffected. We presented 8 human participants with an optic flow stimulus simulating forward self-motion across a ground plane in one of 3 directions. Participants indicated perceived heading. In 57% of the trials, TMS pulses were applied, temporally centered on self-motion onset. The TMS stimulation site was either right-hemisphere hMST, identified by a functional magnetic resonance imaging (fMRI) localizer, or a control area just outside the fMRI localizer activation. As predicted, TMS over area hMST, but not over the control area, increased the response variance of perceived heading as compared with trials without TMS. As hypothesized, this effect was strongest for contraversive self-motion. These data provide first causal evidence for a critical role of hMST in visually guided navigation.
- Published
- 2020
37. Preattentive processing of visually guided self-motion in humans and monkeys
- Author
-
Adrian Schütz, Frank Bremmer, Andre Kaminiarz, Constanze Schmitt, Jakob C. B. Schwenk, and Jan Churan
- Subjects
Heading (navigation) ,biology ,medicine.diagnostic_test ,General Neuroscience ,Visually guided ,Mismatch negativity ,Electroencephalography ,Sensory system ,Haplorhini ,Macaque ,Peak response ,biology.animal ,medicine ,Animals ,Humans ,Self motion ,Attention ,Psychology ,Evoked Potentials ,Neuroscience - Abstract
The visually based control of self-motion is a challenging task, requiring, if needed, immediate adjustments to keep on track. Accordingly, it would appear advantageous if the processing of self-motion direction (heading) was predictive, thereby accelerating the encoding of unexpected changes, and unimpaired by attentional load. We tested this hypothesis by recording EEG in humans and macaque monkeys with similar experimental protocols. Subjects viewed a random dot pattern simulating self-motion across a ground plane in an oddball EEG paradigm. Standard and deviant trials differed only in their simulated heading direction (forward-left vs. forward-right). Event-related potentials (ERPs) were compared in order to test for the occurrence of a visual mismatch negativity (vMMN), a component that reflects preattentive and likely also predictive processing of sensory stimuli. Analysis of the ERPs revealed signatures of a prediction mismatch for deviant stimuli in both humans and monkeys. In humans, a vMMN was observed starting 110 ms after self-motion onset. In monkeys, peak response amplitudes following deviant stimuli were enhanced compared to the standard already 100 ms after self-motion onset. We consider our results strong evidence for a preattentive processing of visual self-motion information in humans and monkeys, allowing for ultrafast adjustments of their heading direction.
- Published
- 2021
38. Influence of tactile flow on visual heading perception
- Author
-
Frank Bremmer, Lisa Rosenblum, Elisa Grewe, and Jan Churan
- Subjects
Ophthalmology ,Heading (navigation) ,Flow (mathematics) ,business.industry ,Perception ,media_common.quotation_subject ,Computer vision ,Artificial intelligence ,business ,Psychology ,Sensory Systems ,media_common - Published
- 2021
39. Decoding of visually guided and interceptive saccades from area LIP of macaque monkeys
- Author
-
Frank Bremmer, Andre Kaminiarz, Jakob C. B. Schwenk, and Jan Churan
- Subjects
Ophthalmology ,biology ,Computer science ,business.industry ,biology.animal ,Visually guided ,Computer vision ,Artificial intelligence ,business ,Macaque ,Sensory Systems ,Decoding methods - Published
- 2021
40. Pourquoi le monde reste-t-il stable quand nous bougeons les yeux ?
- Author
-
Frank Bremmer
- Published
- 2020
41. Predictive coding in a multisensory path integration task: An fMRI study
- Author
-
Milosz, Krala, Bianca, van Kemenade, Benjamin, Straube, Tilo, Kircher, and Frank, Bremmer
- Subjects
Adult ,Male ,Movement ,Motion Perception ,Sensation ,Fixation, Ocular ,Magnetic Resonance Imaging ,Frontal Lobe ,Young Adult ,Cognition ,Mental Processes ,Acoustic Stimulation ,Auditory Perception ,Humans ,Female ,Photic Stimulation - Abstract
During self-motion through an environment, our sensory systems are confronted with a constant flow of information from different modalities. To successfully navigate, self-induced sensory signals have to be dissociated from externally induced sensory signals. Previous studies have suggested that the processing of self-induced sensory information is modulated by means of predictive coding mechanisms. However, the neural correlates of processing self-induced sensory information from different modalities during self-motion are largely unknown. Here, we asked if and how the processing of visually simulated self-motion and/or associated auditory stimuli is modulated by self-controlled action. Participants were asked to actively reproduce a previously observed simulated self-displacement (path integration). Blood oxygen level-dependent (BOLD) activation during this path integration was compared with BOLD activation during a condition in which we passively replayed the exact sensory stimulus that had been produced by the participants in previous trials. We found supramodal BOLD suppression in parietal and frontal regions. Remarkably, BOLD contrast in sensory areas was enhanced in a modality-specific manner. We conclude that the effect of action on sensory processing is strictly dependent on the respective behavioral task and its relevance.
- Published
- 2019
42. Saccade-induced changes in ocular torsion reveal predictive orientation perception
- Author
-
T Scott, Murdison, Gunnar, Blohm, and Frank, Bremmer
- Subjects
Adult ,Male ,Young Adult ,Rotation ,Saccades ,Visual Perception ,Humans ,Female ,Fixation, Ocular ,Orientation, Spatial ,Retina ,Vision, Ocular - Abstract
Natural orienting of gaze often results in a retinal image that is rotated relative to space due to ocular torsion. However, we perceive neither this rotation nor a moving world despite visual rotational motion on the retina. This perceptual stability is often attributed to the phenomenon known as predictive remapping, but the current remapping literature ignores this torsional component. In addition, studies often simply measure remapping across either space or features (e.g., orientation) but in natural circumstances, both components are bound together for stable perception. One natural circumstance in which the perceptual system must account for the current and future eye orientation to correctly interpret the orientation of external stimuli occurs during movements to or from oblique eye orientations (i.e., eye orientations with both a horizontal and vertical angular component relative to the primary position). Here we took advantage of oblique eye orientation-induced ocular torsion to examine perisaccadic orientation perception. First, we found that orientation perception was largely predicted by the rotated retinal image. Second, we observed a presaccadic remapping of orientation perception consistent with maintaining a stable (but spatially inaccurate) retinocentric perception throughout the saccade. These findings strongly suggest that our seamless perceptual stability relies on retinocentric signals that are predictively remapped in all three ocular dimensions with each saccade.
- Published
- 2019
43. Neural correlate of spatial (mis‐)localization during smooth eye movements
- Author
-
Stefan Dowiasch, Frank Bremmer, and Gunnar Blohm
- Subjects
Visual perception ,eye‐position decoding ,genetic structures ,Computer science ,Cognitive Neuroscience ,Macaque ,Frame of reference ,050105 experimental psychology ,Smooth pursuit ,mislocalization ,03 medical and health sciences ,0302 clinical medicine ,biology.animal ,Saccades ,Animals ,0501 psychology and cognitive sciences ,Computer vision ,Attention ,Neurons ,Communication ,biology ,business.industry ,General Neuroscience ,05 social sciences ,Eye movement ,Brain ,macaque area VIP ,Saccadic masking ,eye diseases ,Pursuit, Smooth ,smooth pursuit ,Space Perception ,Fixation (visual) ,Evoked Potentials, Visual ,Macaca ,Artificial intelligence ,sense organs ,business ,030217 neurology & neurosurgery ,Decoding methods - Abstract
The dependence of neuronal discharge on the position of the eyes in the orbit is a functional characteristic of many visual cortical areas of the macaque. It has been suggested that these eye-position signals provide relevant information for a coordinate transformation of visual signals into a non-eye-centered frame of reference. This transformation could be an integral part of achieving visual perceptual stability across eye movements. Previous studies demonstrated close to veridical eye-position decoding during stable fixation as well as characteristic erroneous decoding across saccadic eye movements. Here we aimed to decode eye position during smooth pursuit. We recorded neural activity in macaque area VIP during steady fixation, saccades, and smooth pursuit and investigated the temporal and spatial accuracy of eye position as decoded from the neuronal discharges. Confirming previous results, the activity of the majority of neurons depended linearly on horizontal and vertical eye position. The application of a previously introduced computational approach (isofrequency decoding) allowed eye position decoding with considerable accuracy during steady fixation. We applied the same decoder to the activity of the same neurons during smooth pursuit. On average, the decoded signal led the current eye position. A model combining this constant lead of the decoded eye position with a previously described attentional bias ahead of the pursuit target describes the asymmetric mislocalization pattern for briefly flashed stimuli during smooth pursuit eye movements as found in human behavioral studies.
- Published
- 2016
44. Multisensory Integration in Self Motion Perception
- Author
-
Mark W. Greenlee, Mariia Kaliuzhna, Frank Bremmer, Sebastian M. Frank, Andrew Smith, Paul R. MacNeilage, Jan Churan, Luigi F. Cuturi, and Olaf Blanke
- Subjects
Vestibular system ,Modalities ,genetic structures ,Cognitive Neuroscience ,05 social sciences ,Multisensory integration ,Eye movement ,Experimental and Cognitive Psychology ,Human brain ,Somatosensory system ,050105 experimental psychology ,Sensory Systems ,Motion (physics) ,03 medical and health sciences ,Ophthalmology ,0302 clinical medicine ,medicine.anatomical_structure ,Psychophysics ,medicine ,0501 psychology and cognitive sciences ,Computer Vision and Pattern Recognition ,Psychology ,Neuroscience ,030217 neurology & neurosurgery - Abstract
Self-motion perception involves the integration of visual, vestibular, somatosensory and motor signals. This article reviews the findings from single-unit electrophysiology, functional and structural magnetic resonance imaging, and psychophysics to present an update on how the human and non-human primate brain integrates multisensory information to estimate one's position and motion in space. The results indicate that there is a network of regions in the non-human primate and human brain that processes self-motion cues from the different sense modalities.
- Published
- 2016
45. SNARC Effect in Different Effectors
- Author
-
Philipp N. Hesse, Katja Fiehler, and Frank Bremmer
- Subjects
Adult ,Male ,Poison control ,Experimental and Cognitive Psychology ,050105 experimental psychology ,Association ,Fingers ,Correlation ,Modality independence ,Young Adult ,03 medical and health sciences ,0302 clinical medicine ,Stimulus modality ,Artificial Intelligence ,Saccades ,Humans ,0501 psychology and cognitive sciences ,Statistical analysis ,Association (psychology) ,Communication ,Effector ,business.industry ,05 social sciences ,Mathematical Concepts ,Sensory Systems ,Ophthalmology ,Space Perception ,Saccade ,Arm ,Female ,Psychology ,business ,Neuroscience ,Psychomotor Performance ,030217 neurology & neurosurgery - Abstract
The SNARC (spatial numerical association of response codes) effect, indicating that subjects react faster to the left for small numbers and to the right for large numbers, is used as evidence for the idea that humans use space to organize number representations. Previous studies compared the SNARC effect across sensory modalities within participants and concluded modality independence. So far, it is unknown what sensory-to-motor mappings are involved in generating the SNARC effect and whether these mappings are identical for different effectors within subjects. Hence, we tested whether the SNARC effect is effector specific. Participants performed an auditory parity judgment task and responded with three different effectors: finger (button release), eyes (saccades), and arm (pointing). The SNARC effect occurred in each effector but varied in strength across the effectors. Across subjects, we found a significant correlation of SNARC strength for finger and arm responses, suggesting the use of a shared sensory-to-motor mapping. SNARC strength did not correlate, however, between finger and eyes or arm and eyes. An additional statistical analysis based on conditional probabilities provided further evidence for the effector specificity of the SNARC effect. Taken together, our results suggest that the sensory-to-motor mapping is not as tight as would be expected if the SNARC effect were effector independent.
- Published
- 2015
46. Eye movements during path integration
- Author
-
Jan, Churan, Anna, von Hopffgarten, and Frank, Bremmer
- Subjects
Feedback, Physiological ,Male ,genetic structures ,Eye Movements ,Optic Flow ,distance reproduction ,eye diseases ,self‐motion ,Young Adult ,Auditory Perception ,Humans ,Female ,visual ,sense organs ,Cognitive and Behavioural Neuroscience ,Sensory Neuroscience ,multi‐modal ,Auditory ,Psychomotor Performance ,Original Research - Abstract
Self-motion induces spontaneous eye movements which serve the purpose of stabilizing the visual image on the retina. Previous studies have mainly focused on their reflexive nature and how the perceptual system disentangles visual flow components caused by eye movements and self-motion. Here, we investigated the role of eye movements in distance reproduction (path integration). We used bimodal (visual-auditory) simulation of self-motion: visual optic flow was paired with an auditory stimulus whose frequency was scaled with simulated speed. The task of the subjects in each trial was, first, to observe the simulated self-motion over a certain distance (Encoding phase) and, second, to actively reproduce the observed distance using only visual, only auditory, or bimodal feedback (Reproduction phase). We found that eye positions and eye speeds were strongly correlated between the Encoding and the Reproduction phases. This was the case even when reproduction relied solely on auditory information and thus no visual stimulus was presented. We believe that these correlations are indicative of a contribution of eye movements to path integration.
- Published
- 2018
47. A Wireless, Bidirectional Interface for In Vivo Recording and Stimulation of Neural Activity in Freely Behaving Rats
- Author
-
Liana, Melo-Thomas, K-Alexander, Engelhardt, Uwe, Thomas, Dirk, Hoehl, Sascha, Thomas, Markus, Wöhr, Bjoern, Werner, Frank, Bremmer, and Rainer K W, Schwarting
- Subjects
Male ,Behavior ,telemetry ,wireless stimulation ,brain stimulation ,bi-directionality ,electrophysiology ,Electrodes, Implanted ,Rats ,Animals ,Issue 129 ,multichannel recordings ,wireless extracellular recording ,Rats, Wistar ,Wireless Technology - Abstract
In vivo electrophysiology is a powerful technique to investigate the relationship between brain activity and behavior at a millisecond and micrometer scale. However, current methods mostly rely on tethered cable recordings or only use unidirectional systems, allowing either recording or stimulation of neural activity, but not at the same time or same target. Here, a new wireless, bidirectional device for simultaneous multichannel recording and stimulation of neural activity in freely behaving rats is described. The system operates through a single portable head stage that both transmits recorded activity and can be targeted in real time for brain stimulation using telemetry-based multichannel software. The head stage is equipped with a preamplifier and a rechargeable battery, allowing stable long-term recordings or stimulation for up to 1 h. Importantly, the head stage is compact, weighs 12 g (including battery), and thus has minimal impact on the animal's behavioral repertoire, making the method applicable to a broad set of behavioral tasks. Moreover, the method has the major advantage that the effect of brain stimulation on neural activity and behavior can be measured simultaneously, providing a tool to assess the causal relationships between specific brain activation patterns and behavior. This feature makes the method particularly valuable for the field of deep brain stimulation, allowing precise assessment, monitoring, and adjustment of stimulation parameters during long-term behavioral experiments. The applicability of the system has been validated using the inferior colliculus as a model structure.
- Published
- 2017
49. Fronto-insula network activity explains emotional dysfunctions in juvenile myoclonic epilepsy: Combined evidence from pupillometry and fMRI
- Author
-
Frieder M. Paulus, Marcus Belke, Jens Sommer, Laura Müller-Pinzler, Frank Bremmer, Sören Krach, Wolfgang Einhäuser, Christine Roth, Andreas Jansen, Susanne Knake, Marius Blanke, Felix Rosenow, and Katja Menzler
- Subjects
Adult ,Male ,Ventrolateral prefrontal cortex ,Adolescent ,Brain activity and meditation ,Cognitive Neuroscience ,media_common.quotation_subject ,Clinical Neurology ,Experimental and Cognitive Psychology ,Empathy ,Neuropsychological Tests ,behavioral disciplines and activities ,Anterior cingulate cortex ,Young Adult ,Juvenile myoclonic epilepsy ,Image Processing, Computer-Assisted ,medicine ,Humans ,ddc:610 ,media_common ,Emotion ,Cerebral Cortex ,Brain Mapping ,medicine.diagnostic_test ,Secondary somatosensory cortex ,fMRI ,Myoclonic Epilepsy, Juvenile ,Magnetic Resonance Imaging ,Neuropsychology and Physiological Psychology ,medicine.anatomical_structure ,Pupillometry ,Neurology ,Female ,Psychology ,Functional magnetic resonance imaging ,Neuroscience ,psychological phenomena and processes ,Pain empathy - Abstract
Emotional instability, difficulties in social adjustment, and disinhibited behavior are the most common symptoms of the psychiatric comorbidities in juvenile myoclonic epilepsy (JME). This psychopathology has been associated with dysfunctions of mesial-frontal brain circuits. The present work is a first direct test of this link and adapted a paradigm for probing frontal circuits during empathy for pain. Neural and psychophysiological parameters of pain empathy were assessed by combining functional magnetic resonance imaging (fMRI) with simultaneous pupillometry in 15 JME patients and 15 matched healthy controls. In JME patients, we observed reduced neural activation of the anterior cingulate cortex (ACC), the anterior insula (AI), and the ventrolateral prefrontal cortex (VLPFC). This modulation was paralleled by reduced pupil dilation during empathy for pain in patients. At the same time, pupil dilation was positively related to neural activity of the ACC, AI, and VLPFC. In JME patients, the ACC additionally showed reduced functional connectivity with the primary and secondary somatosensory cortex, areas fundamentally implicated in processing the somatic cause of another's pain. Our results provide first evidence that alterations of mesial-frontal circuits directly affect psychosocial functioning in JME patients and establish a link between pupil dynamics and brain activity during emotional processing. The findings of reduced pain-empathy-related activation of the ACC and AI and aberrant functional integration of the ACC with somatosensory cortex areas provide further evidence for this network's role in social behavior and help explain the JME psychopathology and patients' difficulties in social adjustment.
- Published
- 2015
- Full Text
- View/download PDF
50. Visual selectivity for heading in the macaque ventral intraparietal area
- Author
-
Klaus-Peter Hoffmann, Frank Bremmer, Anja Schlack, Andre Kaminiarz, and Markus Lappe
- Subjects
Neurons ,Supplementary eye field ,Heading (navigation) ,Eye Movements ,genetic structures ,biology ,Physiology ,Computer science ,General Neuroscience ,Motion Perception ,Parietal lobe ,Eye movement ,Posterior parietal cortex ,Optic Flow ,Medial superior temporal area ,Macaca mulatta ,Macaque ,Orientation ,Parietal Lobe ,biology.animal ,Animals ,Motion perception ,Neuroscience ,Photic Stimulation - Abstract
The patterns of optic flow seen during self-motion can be used to determine the direction of one's own heading. Tracking eye movements, which typically occur in everyday life, alter this task, since they add further retinal image motion and (predictably) distort the retinal flow pattern. Humans employ both visual and nonvisual (extraretinal) information to solve a heading task in such cases. Likewise, it has been shown that neurons in the monkey medial superior temporal area (area MST) use both signals during the processing of self-motion information. In this article we report that neurons in the macaque ventral intraparietal area (area VIP) use visual information derived from the distorted flow patterns to encode heading during (simulated) eye movements. We recorded responses of VIP neurons to simple radial flow fields and to distorted flow fields that simulated self-motion plus eye movements. In 59% of the cases, cell responses compensated for the distortion and kept the same heading selectivity irrespective of different simulated eye movements. In addition, response modulations during real compared with simulated eye movements were smaller, consistent with reafferent signaling involved in the processing of the visual consequences of eye movements in area VIP. We conclude that the motion selectivities found in area VIP, like those in area MST, provide a way to successfully analyze and use flow fields during self-motion and simultaneous tracking movements.
- Published
- 2014