30 results for '"Jason S. Chan"'
Search Results
2. Severity of central sleep apnea does not affect sleeping oxygen saturation during ascent to high altitude
- Author
-
Gurkarn Saran, Jordan Bird, Mingma T. Sherpa, Anne Kalker, Garrick Chan, Trevor A. Day, Thomas D. Brutsaert, Jason S. Chan, Alexander N. Rimke, Nicholas G. Jendzjowsky, and Richard J. A. Wilson
- Subjects
medicine.medical_specialty ,Central sleep apnea ,Physiology ,business.industry ,Altitude ,Effects of high altitude on humans ,Affect (psychology) ,medicine.disease ,Sleep Apnea, Central ,Oxygen ,03 medical and health sciences ,0302 clinical medicine ,030228 respiratory system ,Physiology (medical) ,Internal medicine ,Periodic breathing ,medicine ,Blood oxygenation ,Cardiology ,Humans ,Sleep ,business ,human activities ,030217 neurology & neurosurgery ,Oxygen saturation (medicine) - Abstract
Central sleep apnea (CSA) is universal during ascent to high altitude, with intermittent and transient fluctuations in oxygen saturation, but the consequences on mean sleeping blood oxygenation are unclear. We assessed indices of CSA and mean sleeping peripheral oxygen saturation (SpO2) during ascent to high altitude using two ascent profiles: rapid ascent to and residence at 3,800 m, and incremental ascent to 5,160 m. The severity of CSA was not correlated with mean sleeping SpO2 with ascent.
- Published
- 2021
- Full Text
- View/download PDF
3. Peripheral hypercapnic chemosensitivity in trained and untrained females and males during exercise
- Author
-
Leah M. Mann, Jason S. Chan, Sarah A. Angus, Connor J. Doherty, Benjamin P. Thompson, Glen E. Foster, Richard L. Hughson, and Paolo B. Dominelli
- Subjects
Male ,Hypercapnia ,Exercise Tolerance ,Physiology ,Physiology (medical) ,Exercise Test ,Humans ,Female ,Carbon Dioxide ,Exercise - Abstract
The hypercapnic chemoresponse to transient CO2 increased during acute physical activity; however, this response did not persist with further increases in intensity and did not differ between participants of differing aerobic fitness. Males and females show a differing response to CO2 during exercise when compared with an iso-V̇co2 condition. Our results suggest that adaptations that lead to increased aerobic fitness do not affect the hypercapnic ventilatory response, but there is an effect of sex.
- Published
- 2022
4. Impact of wearing a surgical and cloth mask during cycle exercise
- Author
-
Yannick Molgat-Seon, Sarah A Angus, Paolo B. Dominelli, Connor J Doherty, Jason S. Chan, and Leah M Mann
- Subjects
Adult ,Male ,medicine.medical_specialty ,Physiology ,Partial Pressure ,Endocrinology, Diabetes and Metabolism ,030204 cardiovascular system & hematology ,Young Adult ,03 medical and health sciences ,0302 clinical medicine ,Respiratory Rate ,Heart Rate ,Physiology (medical) ,Pressure ,Tidal Volume ,Humans ,Medicine ,Cycle exercise ,Exercise ,Mouth ,Nutrition and Dietetics ,business.industry ,Masks ,COVID-19 ,Cardiopulmonary exercise ,Equipment Design ,General Medicine ,Carbon Dioxide ,030210 environmental & occupational health ,Oxygen ,Dyspnea ,Face ,Oxyhemoglobins ,Exercise Test ,Physical therapy ,Female ,Skin Temperature ,business ,Cycling - Abstract
We sought to determine the impact of wearing cloth or surgical masks on the cardiopulmonary responses to moderate-intensity exercise. Twelve subjects (n = 5 females) completed three, 8-min cycling trials while breathing through a non-rebreathing valve (laboratory control), cloth, or surgical mask. Heart rate (HR), oxyhemoglobin saturation (SpO2), breathing frequency, mouth pressure, partial pressure of end-tidal carbon dioxide (PetCO2) and oxygen (PetO2), and dyspnea were measured throughout exercise. A subset of n = 6 subjects completed an additional exercise bout without a mask (ecological control). There were no differences in breathing frequency, HR or SpO2 across conditions (all p > 0.05). Compared with the laboratory control (4.7 ± 0.9 cmH2O [mean ± SD]), mouth pressure swings were smaller with the surgical mask (0.9 ± 0.7 cmH2O; p < 0.0001), but similar with the cloth mask (3.6 ± 4.8 cmH2O; p = 0.66). Wearing a cloth mask decreased PetO2 (−3.5 ± 3.7 mm Hg) and increased PetCO2 (+2.0 ± 1.3 mm Hg) relative to the ecological control (both p < 0.05). There were no differences in end-tidal gases between mask conditions and laboratory control (both p > 0.05). Dyspnea was similar between the control conditions and the surgical mask (p > 0.05) but was greater with the cloth mask compared with the laboratory (+0.9 ± 1.2) and ecological (+1.5 ± 1.3) control conditions (both p < 0.05). Wearing a mask during short-term moderate-intensity exercise may increase dyspnea but has minimal impact on the cardiopulmonary response. Novelty: Wearing surgical or cloth masks during exercise has no impact on breathing frequency, tidal volume, oxygenation, and heart rate. However, there are some changes in inspired and expired gas fractions that are physiologically irrelevant. In young healthy individuals, wearing surgical or cloth masks during submaximal exercise has few physiological consequences.
- Published
- 2021
- Full Text
- View/download PDF
5. Blood glucose concentration is unchanged during exposure to acute normobaric hypoxia in healthy humans
- Author
-
Zahrah H. Rampuri, Normand G. Boulé, Trevor A. Day, Garrick Chan, Alexandra E. Chiew, Craig D. Steinback, Jason S. Chan, Mackenzie D. Kozak, Margie H. Davenport, and Alexander N. Rimke
- Subjects
Adult ,Blood Glucose ,Male ,medicine.medical_specialty ,Physiology ,Glucose ingestion ,030204 cardiovascular system & hematology ,blood [glucose] regulation ,Beverages ,acute hypoxia ,Young Adult ,03 medical and health sciences ,0302 clinical medicine ,Heart Rate ,Physiology (medical) ,Internal medicine ,QP1-981 ,insulin sensitivity ,Humans ,Medicine ,Hypoxia ,030304 developmental biology ,0303 health sciences ,Normobaric hypoxia ,Cross-Over Studies ,business.industry ,Original Articles ,High altitude hypoxia ,Metabolism ,Hypoxia (medical) ,Oxygen ,Endocrinology ,Oxygen Saturation ,Room air distribution ,Original Article ,Female ,Normal blood ,Blood sugar regulation ,medicine.symptom ,business ,acute hyperglycemia - Abstract
Normal blood [glucose] regulation is critical to support metabolism, particularly in contexts of metabolic stressors (e.g., exercise, high altitude hypoxia). Data regarding blood [glucose] regulation in hypoxia are inconclusive. We aimed to characterize blood [glucose] over 80 min following glucose ingestion during both normoxia and acute normobaric hypoxia. In a randomized cross‐over design, on two separate days, 28 healthy participants (16 females; 21.8 ± 1.6 years; BMI 22.8 ± 2.5 kg/m2) were randomly exposed to either NX (room air; fraction of inspired [FI]O2 ~0.21) or HX (FIO2 ~0.148) in a normobaric hypoxia chamber. Measured FIO2 and peripheral oxygen saturation were both lower at baseline in hypoxia (p 0.77). In addition, mean, peak, and time‐to‐peak responses during the 80 min were not different between conditions (p > 0.14). There were also no sex differences in these blood [glucose] responses in hypoxia. We conclude that glucose regulation is unchanged in young, healthy participants with exposure to acute steady‐state normobaric hypoxia, likely due to counterbalancing mechanisms underlying blood [glucose] regulation in hypoxia. In a large sample of young healthy male and female participants, blood glucose homeostasis in response to a standard oral glucose load (75 g, 296 ml) was unchanged in response to exposure to acute normobaric hypoxia (FIO2 14.8%) compared to room air over 80 min, using multiple metrics including mean, peak, and time‐to‐peak responses.
- Published
- 2021
- Full Text
- View/download PDF
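The glucose study above (record 5) summarises each 80-min response with three metrics: mean, peak, and time-to-peak blood [glucose]. As a rough illustration of how such summary metrics are computed from sampled data, here is a minimal Python sketch; the sampling schedule, values, and function name are invented for illustration and are not taken from the paper.

```python
import numpy as np

def glucose_summary(times_min, glucose_mmol_l):
    """Return mean, peak, and time-to-peak for one sampled glucose response.

    times_min and glucose_mmol_l are equal-length 1-D sequences; the sampling
    schedule used below is purely illustrative, not the study's protocol.
    """
    t = np.asarray(times_min, dtype=float)
    g = np.asarray(glucose_mmol_l, dtype=float)
    i_peak = int(np.argmax(g))
    return {
        "mean": float(g.mean()),
        "peak": float(g[i_peak]),
        "time_to_peak_min": float(t[i_peak]),
    }

# Hypothetical 80-min response sampled every 10 min (values are made up).
times = [0, 10, 20, 30, 40, 50, 60, 70, 80]
values = [4.9, 6.8, 8.1, 7.6, 7.0, 6.4, 5.9, 5.5, 5.2]
print(glucose_summary(times, values))
```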
6. Inferring a spatial code of cell-cell interactions across a whole animal body
- Author
-
Eyleen J. O’Rourke, Erick Armingol, Nathan E. Lewis, Jason S. Chan, Chintan Joshi, Abbas Ghaddar, Hsuan-Lin Her, Isaac Shamie, and Hratch M. Baghdassarian
- Subjects
Cell type ,medicine.anatomical_structure ,Interaction network ,Cell ,medicine ,Computational biology ,Biology ,biology.organism_classification ,Spatial analysis ,Phenotype ,Spatial organization ,Function (biology) ,Caenorhabditis elegans - Abstract
Summary: Cell-cell interactions shape cellular function and ultimately organismal phenotype. However, the spatial code embedded in the molecular interactions driving and sustaining spatial organization remains to be elucidated. Here we present a computational framework to infer the spatial code underlying cell-cell interactions in a whole animal. Using the transcriptomes of the cell types composing Caenorhabditis elegans’ body, we compute the potential for intercellular interactions from the coexpression of ligand-receptor pairs. Leveraging a 3D atlas of C. elegans’ cells and a genetic algorithm, we identify the ligand-receptor pairs most informative of the spatial organization of cells. The resulting intercellular distances are negatively correlated with the potential for cell-cell interaction, validating this strategy. Further, for selected ligand-receptor pairs, we experimentally confirm the algorithm-generated cell-cell interactions. Thus, our computational framework helps identify a code associated with spatial organization and cellular functions across a whole-animal body, showing that single-cell molecular measurements provide spatial information that may help elucidate organismal phenotypes and disease. Highlights:
- A cell-cell interaction network in the whole body of C. elegans is presented.
- Intercellular distance and interactions are negatively correlated.
- A combination of ligand-receptor pairs carries a spatial code of cell-cell interactions.
- Spatial expression of specific ligand-receptor pairs is validated in vivo.
- Published
- 2020
- Full Text
- View/download PDF
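Record 6 describes computing a cell-cell interaction potential from ligand-receptor coexpression and then testing whether that potential is negatively correlated with 3D intercellular distance. The paper's actual scoring function and genetic-algorithm selection are not reproduced here; the sketch below only illustrates one common way to score ligand-receptor coexpression (product of ligand expression in a sender cell type and receptor expression in a receiver cell type, summed over pairs) and to correlate the scores against distances. All names and values are hypothetical.

```python
import numpy as np
from scipy.stats import spearmanr

def interaction_potential(expr, lr_pairs):
    """Score every ordered (sender, receiver) cell-type pair.

    expr: dict mapping cell type -> dict of gene -> expression level.
    lr_pairs: list of (ligand_gene, receptor_gene) tuples.
    Returns dict mapping (sender, receiver) -> summed ligand*receptor score.
    """
    scores = {}
    for sender, s_expr in expr.items():
        for receiver, r_expr in expr.items():
            if sender == receiver:
                continue
            scores[(sender, receiver)] = sum(
                s_expr.get(lig, 0.0) * r_expr.get(rec, 0.0)
                for lig, rec in lr_pairs
            )
    return scores

# Hypothetical expression values and ligand-receptor pairs.
expr = {
    "neuron":    {"ligA": 5.0, "recB": 0.5, "ligC": 1.0, "recD": 2.0},
    "muscle":    {"ligA": 0.2, "recB": 4.0, "ligC": 0.1, "recD": 0.3},
    "intestine": {"ligA": 1.0, "recB": 1.0, "ligC": 3.0, "recD": 1.5},
}
pairs = [("ligA", "recB"), ("ligC", "recD")]
scores = interaction_potential(expr, pairs)

# Hypothetical mean 3D distances between the same ordered cell-type pairs.
distances = {k: d for k, d in zip(scores, [2.0, 15.0, 9.0, 4.0, 11.0, 7.0])}
keys = list(scores)
rho, p = spearmanr([scores[k] for k in keys], [distances[k] for k in keys])
print(f"Spearman rho = {rho:.2f} (expect negative if closer cells interact more)")
```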
7. Minimizing airflow turbulence in women lowers the work of breathing to levels similar to men
- Author
-
Leah M Mann, Jason S. Chan, Emily A. Granger, Annie Yu, Paolo B. Dominelli, and Yannick Molgat-Seon
- Subjects
Male ,medicine.medical_specialty ,Physiology ,Airflow ,Heliox ,Helium ,03 medical and health sciences ,Work of breathing ,0302 clinical medicine ,Airway resistance ,Physiology (medical) ,Internal medicine ,Medicine ,Humans ,Exercise ,Work of Breathing ,Breathing room air ,business.industry ,Turbulence ,digestive, oral, and skin physiology ,Laminar flow ,030229 sport sciences ,respiratory system ,Oxygen ,Cardiology ,Breathing ,Female ,business ,030217 neurology & neurosurgery ,Research Article - Abstract
Smaller airways increase resistance and the propensity toward turbulent airflow, both of which are thought to be mechanisms behind greater resistive and total work of breathing (Wb) in females. Previous research examining the effect of airway size on the Wb between the sexes is limited by the inability to experimentally manipulate airway size. Heliox (21% oxygen, balance helium) is less dense than room air, which reduces turbulent airflow and airway resistance. The purpose of our study was to utilize heliox inspiration in women to provide a stimulus physiologically similar to increasing airway size. We hypothesized that when breathing heliox women would have a Wb similar to men breathing room air. Eighteen healthy young subjects (n = 9 women, 9 men) completed two maximal exercise tests on a cycle ergometer over 2 days. Subjects breathed room air for one test and heliox for the other. Wb was assessed with an esophageal balloon catheter. During the room air trial, when ventilations were >65 L/min, women had a significantly greater Wb compared with men (P < 0.05). The greater Wb in women was due to greater resistance to turbulent flow. For both sexes, breathing heliox resulted in increased expiratory flow (+132 ± 18% of room air), an elimination of expiratory flow limitation, and a reduction in Wb (69 ± 12% of room air) (all P < 0.05). When the women were breathing heliox, Wb was not different from that in the men breathing room air. Our findings support the idea that the smaller conducting airways in females are responsible for a greater total and resistive Wb. NEW & NOTEWORTHY When healthy young women breathe heliox gas during exercise, their work of breathing is not different from men breathing room air. Heliox inspiration reduces airway resistance and promotes laminar flow, which is a physiologically similar effect of increasing airway size. Our findings provide experimental evidence that smaller airways in women are responsible for the greater work of breathing during exercise.
- Published
- 2020
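In record 7, the work of breathing (Wb) was measured with an esophageal balloon catheter; one textbook simplification estimates work per breath from the area enclosed by the esophageal pressure-volume loop. The Python sketch below shows that generic loop-area calculation using the shoelace formula; it is an illustration of that standard idea under invented data, not the authors' analysis code.

```python
import numpy as np

def loop_area(pressure_cmh2o, volume_l):
    """Area enclosed by a closed pressure-volume loop (shoelace formula).

    Returns joules per breath, using 1 cmH2O * 1 L ~= 0.0981 J.
    """
    p = np.asarray(pressure_cmh2o, dtype=float)
    v = np.asarray(volume_l, dtype=float)
    # Shoelace formula over the closed polygon traced by one breath.
    area_cmh2o_l = 0.5 * abs(np.dot(p, np.roll(v, -1)) - np.dot(v, np.roll(p, -1)))
    return area_cmh2o_l * 0.0981

# Invented coarse loop for a single breath (4 vertices, traced in order).
pressure = [-5.0, -12.0, -8.0, -3.0]   # esophageal pressure, cmH2O
volume = [0.0, 0.8, 1.0, 0.3]          # lung volume above FRC, litres

work_per_breath = loop_area(pressure, volume)
breathing_frequency = 40                # breaths/min, hypothetical heavy exercise
print(f"Wb ~ {work_per_breath * breathing_frequency:.1f} J/min")
```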
8. Impact Of Wearing A Mask During Cycling Exercise
- Author
-
Leah M. Mann, Paolo B. Dominelli, Sarah A Angus, Yannick Molgat-Seon, Connor J. Doherty, and Jason S. Chan
- Subjects
medicine.medical_specialty ,Physical medicine and rehabilitation ,business.industry ,medicine ,Physical Therapy, Sports Therapy and Rehabilitation ,Orthopedics and Sports Medicine ,Cycling ,business - Published
- 2021
- Full Text
- View/download PDF
9. 249 Rehabilitating Perceptual Deficits in Fall-prone Older Adults: Improved Multisensory Processing Following 3 Day Perceptual Training
- Author
-
Jessica O’Brien, Finola Cronin, Jason S. Chan, Annalisa Setti, and Kieran O'Connor
- Subjects
Aging ,business.industry ,Perceptual learning ,Perception ,media_common.quotation_subject ,Medicine ,General Medicine ,Geriatrics and Gerontology ,business ,Perceptual training ,media_common ,Cognitive psychology - Published
- 2017
- Full Text
- View/download PDF
10. 060 Challenging the current assessment criteria for scoring central sleep apnea at altitude
- Author
-
Jordan Bird, Anne Kalker, Jason S. Chan, Tom D. Brutsaert, Trevor A. Day, Alexander N. Rimke, Mingma T. Sherpa, and Garrick Chan
- Subjects
medicine.medical_specialty ,Physical medicine and rehabilitation ,Central sleep apnea ,Altitude ,business.industry ,Physiology (medical) ,medicine ,Neurology (clinical) ,Current (fluid) ,medicine.disease ,business ,respiratory tract diseases - Abstract
Introduction: Sleep disordered breathing comes in two forms: obstructive and central sleep apnea (SA). Obstructive sleep apnea (OSA) is caused by upper airway collapse during sleep, and is associated with increases in morbidity and mortality. Conversely, central sleep apnea (CSA) results from increases in respiratory chemosensitivity to blood gas challenges in the context of high-altitude ascent. CSA increases in severity and apneas shorten in duration with higher ascent and/or time spent at altitude. Although both types of SA are characterized by intermittent periods of apnea and hyperventilation, the underlying mechanisms and phenotypes between OSA and CSA are different. A universal scoring system for the two types of context-dependent SA may lead to errors in quantification. The American Association of Sleep Medicine (AASM) developed assessment criteria for SA, which are universally utilized for all types of SA to quantify an apnea-hypopnea index (AHI; events/hour), where apneas are scored as cessation of breathing ≥10-sec. We aimed to assess the effect of reducing the apnea-detection threshold (ADT) from the standard 10-sec to a shorter 5-sec threshold.
Methods: We assessed CSA using portable polysomnography (ApneaLink, ResMed) during ascent to 5160m in the Nepal Himalaya over 10 days in 15 healthy participants. Files were archived digitally for later analysis using automated scoring software (ApneaLink Reporting Software, ResMed). We quantified and compared AHI using AASM criteria (i.e., 10-sec ADT) and a shorter 5-sec ADT.
Results: AHI was 3.9±4.1 events/hour at 1045m prior to ascent, with AHI increasing to 37.5±32.8 events/hour during ascent.
Conclusion: This preliminary report suggests that the AASM criterion for scoring apneas, which is broadly applied to OSA at low altitude, may underestimate the assessment and quantification of CSA with ascent to and prolonged stays at high altitude. Development of distinct assessment criteria for OSA and CSA may be warranted.
Support (if any): Natural Sciences and Engineering Research Council of Canada
- Published
- 2021
- Full Text
- View/download PDF
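Record 10 compares the apnea-hypopnea index (AHI, scored events per hour of sleep) computed with the standard 10-second apnea-detection threshold against a shorter 5-second threshold. The sketch below only illustrates the arithmetic of that comparison on a list of respiratory-event durations; it is not the ApneaLink scoring software, and the event data are invented.

```python
def ahi(event_durations_s, sleep_hours, detection_threshold_s=10.0):
    """Apnea-hypopnea index: scored events per hour of sleep.

    Only events lasting at least `detection_threshold_s` seconds are counted,
    mirroring the idea of an apnea-detection threshold (ADT).
    """
    scored = [d for d in event_durations_s if d >= detection_threshold_s]
    return len(scored) / sleep_hours

# Invented event durations (seconds) from one night at altitude.
events = [6, 7, 8, 9, 9, 11, 12, 13, 14, 15, 16, 18, 22]
hours_slept = 6.5

print("AHI with 10-s ADT:", round(ahi(events, hours_slept, 10.0), 1))
print("AHI with 5-s ADT: ", round(ahi(events, hours_slept, 5.0), 1))
# Shorter central apneas fall below the 10-s threshold, so the 5-s ADT yields
# a higher AHI, which is the point the abstract makes.
```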
11. Severity of Central Sleep Apnea Does Not Improve Sleeping Oxygen Saturation During Ascent to High Altitude
- Author
-
Trevor A. Day, Tom D. Brutsaert, Alexander N. Rimke, Richard J. A. Wilson, Nicholas G. Jendzjowsky, Jordan Bird, Mingma T. Sherpa, Garrick Chan, A. Kalker, and Jason S. Chan
- Subjects
medicine.medical_specialty ,Central sleep apnea ,business.industry ,Effects of high altitude on humans ,medicine.disease ,Biochemistry ,Internal medicine ,Genetics ,Cardiology ,medicine ,business ,Molecular Biology ,Biotechnology ,Oxygen saturation (medicine) - Published
- 2020
- Full Text
- View/download PDF
12. Reducing Turbulent Airflow Lowers Healthy Females' Work Of Breathing During Exercise To A Level Similar To Males
- Author
-
Paolo B. Dominelli, Emily A. Granger, Annie Yu, Leah M Mann, and Jason S. Chan
- Subjects
medicine.medical_specialty ,Work of breathing ,business.industry ,Internal medicine ,Turbulent airflow ,Genetics ,medicine ,Cardiology ,business ,Molecular Biology ,Biochemistry ,Biotechnology - Published
- 2020
- Full Text
- View/download PDF
13. Improving audio-visual temporal perception through training enhances beta-band activity
- Author
-
Jochen Kaiser, Marcus J. Naumer, Stephanie Theves, and Jason S. Chan
- Subjects
Adult ,Male ,medicine.medical_specialty ,Formative Feedback ,Cognitive Neuroscience ,Stimulus (physiology) ,Audiology ,050105 experimental psychology ,lcsh:RC321-571 ,03 medical and health sciences ,Judgment ,Young Adult ,0302 clinical medicine ,Multisensory integration ,medicine ,Magnetoencephalography (MEG) ,Humans ,0501 psychology and cognitive sciences ,lcsh:Neurosciences. Biological psychiatry. Neuropsychiatry ,Multisensory temporal learning ,medicine.diagnostic_test ,Artificial neural network ,Two-alternative forced choice ,05 social sciences ,Brain ,Magnetoencephalography ,Stimulus onset asynchrony ,Neurology ,Asynchronous communication ,Time Perception ,Auditory Perception ,Visual Perception ,Female ,Percept ,Beta-band activity ,Psychology ,Beta Rhythm ,030217 neurology & neurosurgery - Abstract
Multisensory integration strongly depends on the temporal proximity between two inputs. In the audio-visual domain, stimulus pairs with delays up to a few hundred milliseconds can be perceived as simultaneous and integrated into a unified percept. Previous research has shown that the size of this temporal window of integration can be narrowed by feedback-guided training on an audio-visual simultaneity judgment task. Yet, it has remained uncertain how the neural network that processes audio-visual asynchronies is affected by the training. In the present study, participants were trained on a 2-interval forced choice audio-visual simultaneity judgment task. We recorded their neural activity with magnetoencephalography in response to three different stimulus onset asynchronies (0 ms, each participant’s individual binding window, 300 ms) before, and one day following training. The Individual Window stimulus onset asynchrony condition was derived by assessing each participant’s point of subjective simultaneity. Training improved performance in both asynchronous stimulus onset conditions (300 ms, Individual Window). Furthermore, beta-band amplitude (12–30 Hz) increased from pre-compared to post-training sessions. This increase moved across central, parietal, and temporal sensors during the time window of 80–410 ms post-stimulus onset. Considering the putative role of beta oscillations in carrying feedback from higher to lower cortical areas, these findings suggest that enhanced top-down modulation of sensory processing is responsible for the improved temporal acuity after training. As beta oscillations can be assumed to also preferentially support neural communication over longer conduction delays, the widespread topography of our effect could indicate that training modulates not only processing within primary sensory cortex, but rather the communication within a large-scale network.
- Published
- 2018
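Record 13 derives each participant's point of subjective simultaneity (PSS) and an individual binding-window SOA from audio-visual simultaneity judgments. A common way to obtain such quantities is to fit a Gaussian-shaped psychometric curve to the proportion of "simultaneous" responses across SOAs and read off its centre and width; the Python sketch below does that with scipy on invented data. It is a generic illustration of this fitting approach, not the authors' analysis pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(soa_ms, amplitude, pss_ms, sigma_ms):
    """Proportion of 'simultaneous' responses modelled as a Gaussian over SOA."""
    return amplitude * np.exp(-0.5 * ((soa_ms - pss_ms) / sigma_ms) ** 2)

# Invented data: SOAs (audio leading = negative) and proportion judged simultaneous.
soas = np.array([-400, -300, -200, -100, 0, 100, 200, 300, 400], dtype=float)
p_simultaneous = np.array([0.05, 0.15, 0.45, 0.80, 0.95, 0.85, 0.55, 0.20, 0.10])

params, _ = curve_fit(gaussian, soas, p_simultaneous, p0=[1.0, 0.0, 150.0])
amplitude, pss, sigma = params

print(f"Point of subjective simultaneity ~ {pss:.0f} ms")
# One convention for a binding window: the SOA range where the fitted curve
# exceeds half of its peak (full width at half maximum).
fwhm = 2.355 * sigma
print(f"Binding window (FWHM) ~ {fwhm:.0f} ms")
```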
14. Apnea-hypopnea index is associated with increased loop gain during sleep at ascending altitudes
- Author
-
Alexander N. Rimke, B.M. Shafer, S.A. Sands, Trevor A. Day, G.E. Foster, G. Saran, Jason S. Chan, Garrick Chan, and A. Kalker
- Subjects
medicine.medical_specialty ,Apnea–hypopnea index ,business.industry ,Internal medicine ,medicine ,Cardiology ,General Medicine ,business ,Sleep in non-human animals ,Loop gain - Published
- 2019
- Full Text
- View/download PDF
15. Synaesthesia or Vivid Imagery? A Single Case fMRI Study of Visually Induced Olfactory Perception
- Author
-
Stefanie Hardt, Jörn Lötsch, Jasper J. F. van den Bosch, Marcus J. Naumer, Patrick Pflanz, Jochen Kaiser, Stephanie Theves, and Jason S. Chan
- Subjects
Adult ,Male ,Visual perception ,genetic structures ,Cognitive Neuroscience ,media_common.quotation_subject ,Piriform Cortex ,Experimental and Cognitive Psychology ,Stimulus (physiology) ,Perceptual Disorders ,Primary olfactory cortex ,Perception ,Piriform cortex ,medicine ,Humans ,Synesthesia ,Vision, Ocular ,media_common ,Middle Aged ,Olfactory Perception ,medicine.disease ,Magnetic Resonance Imaging ,Sensory Systems ,Olfactory stimulus ,Smell ,Ophthalmology ,Imagination ,Visual Perception ,Computer Vision and Pattern Recognition ,Percept ,Psychology ,Photic Stimulation ,psychological phenomena and processes ,Cognitive psychology - Abstract
The most common form of synaesthesia is grapheme–colour synaesthesia. However, rarer forms of synaesthesia also exist, such as word–gustatory and olfactory–gustatory synaesthesia, whereby a word or smell will induce a specific taste. In this study we describe a single individual (LJ) who experiences a concurrent olfactory stimulus when presented with congruent visual images. For some visual stimuli, he perceives a strong and automatic olfactory percept, which has existed throughout his life. In this study, we explore whether his experiences are a new form of synaesthesia or simply vivid imagery. Unlike other forms of synaesthesia, the concurrent odour is congruent to the visual inducer. For example, a photograph of dress shoes will elicit the smell of leather. We presented LJ and several control participants with 75 images of everyday objects. Their task was to indicate the strength of any perceived odours induced by the visual images. LJ rated several of the images as inducing a concurrent odour, while controls did not have any such percept. Images that LJ reported as inducing the strongest odours were used, along with colour-matched control images, in the context of an fMRI experiment. Participants were given a one-back task to maintain attention. A block-design odour localizer was presented to localize the piriform cortex (primary olfactory cortex). We found an increased BOLD response in the piriform cortex for the odour-inducing images compared to the control images in LJ. There was no difference in BOLD response between these two stimulus types in the control participants. A subsequent olfactory imagery task did not elicit enhanced activity in the piriform cortex in LJ, suggesting his perceptual experiences may not be based on olfactory imagery.
- Published
- 2014
- Full Text
- View/download PDF
16. Evidence for Crossmodal Interactions across Depth on Target Localisation Performance in a Spatial Array
- Author
-
Fiona N. Newell, Danuta Lisiecka, Annalisa Setti, Jason S. Chan, and Corrina Maguinness
- Subjects
Adult ,Male ,Communication ,Crossmodal ,business.industry ,Experimental and Cognitive Psychology ,Neuropsychological Tests ,Stimulus (physiology) ,Horizontal plane ,Sensory Systems ,Young Adult ,Ophthalmology ,Artificial Intelligence ,Space Perception ,Auditory Perception ,Visual Perception ,Auditory stimuli ,Humans ,Female ,Computer vision ,Artificial intelligence ,Psychology ,business - Abstract
Auditory stimuli are known to improve visual target recognition and detection when both are presented in the same spatial location. However, most studies have focused on crossmodal spatial congruency along the horizontal plane and the effects of audio-visual spatial congruency in depth (i.e., along the depth axis) are relatively less well understood. In the following experiments we presented a visual (face) or auditory (voice) target stimulus in a location on a spatial array which was either spatially congruent or incongruent in depth (i.e., positioned directly in front or behind) with a crossmodal stimulus. The participant's task was to determine whether a visual (experiments 1 and 3) or auditory (experiment 2) target was located in the foreground or background of this array. We found that both visual and auditory targets were less accurately located when crossmodal stimuli were presented from different, compared to congruent, locations in depth. Moreover, this effect was particularly found for visual targets located in the periphery, although spatial incongruency affected the location of auditory targets across both locations. The relative distance of the array to the observer did not seem to modulate this congruency effect (experiment 3). Our results add to the growing evidence for multisensory influences on search performance and extend these findings to the localisation of targets in the depth plane.
- Published
- 2012
- Full Text
- View/download PDF
17. Familiarity of objects affects susceptibility to the sound-induced flash illusion
- Author
-
Annalisa Setti and Jason S. Chan
- Subjects
Adult ,Male ,Auditory perception ,Visual perception ,Adolescent ,Optical illusion ,General Neuroscience ,media_common.quotation_subject ,Illusion ,Multisensory integration ,Recognition, Psychology ,Middle Aged ,Stimulus (physiology) ,Illusions ,Acoustic Stimulation ,Phenomenon ,Auditory Perception ,Visual Perception ,Humans ,Female ,Psychology ,Auditory illusion ,Photic Stimulation ,Cognitive psychology ,media_common - Abstract
Audition is accepted as more reliable (thus dominant) than vision when temporal discrimination is required by the task. However, it is not known whether the characteristics of the visual stimulus, for example its familiarity to the perceiver, affect auditory dominance. In this study we manipulated familiarity of the visual stimulus in a well-established multisensory phenomenon, i.e., the sound-induced flash illusion. This illusion occurs when, for example, one brief visual stimulus (e.g., a flash) is presented in close temporal proximity with two brief sounds; participants perceive two flashes instead of one. We found that when the visual stimuli (faces or buildings) were familiar, participants were less susceptible to the illusion than when they were unfamiliar. As the illusion has been ascribed to early cross-sensory interactions between vision and audition, the present work offers behavioural evidence that high-level processing of object characteristics, such as familiarity, affects early temporal multisensory integration. Possible mechanisms underlying the effect of familiarity are discussed.
- Published
- 2011
- Full Text
- View/download PDF
18. Explaining autism spectrum disorders: central coherence vs. predictive coding theories
- Author
-
Marcus J. Naumer and Jason S. Chan
- Subjects
Male ,Predictive coding ,genetic structures ,Physiology ,General Neuroscience ,Multisensory integration ,Coherence (statistics) ,medicine.disease ,behavioral disciplines and activities ,Spectrum (topology) ,Acoustic Stimulation ,Child Development Disorders, Pervasive ,Autism spectrum disorder ,mental disorders ,Auditory Perception ,Reaction Time ,Visual Perception ,medicine ,Humans ,Autism ,Female ,Brief Communications ,Psychology ,Photic Stimulation ,Cognitive psychology - Abstract
The new DSM-5 diagnostic criteria for autism spectrum disorders (ASDs) include sensory disturbances in addition to the well-established language, communication, and social deficits. One sensory disturbance seen in ASD is an impaired ability to integrate multisensory information into a unified percept. This may arise from an underlying impairment in which individuals with ASD have difficulty perceiving the temporal relationship between cross-modal inputs, an important cue for multisensory integration. Such impairments in multisensory processing may cascade into higher-level deficits, impairing day-to-day functioning on tasks, such as speech perception. To investigate multisensory temporal processing deficits in ASD and their links to speech processing, the current study mapped performance on a number of multisensory temporal tasks (with both simple and complex stimuli) onto the ability of individuals with ASD to perceptually bind audiovisual speech signals. High-functioning children with ASD were compared with a group of typically developing children. Performance on the multisensory temporal tasks varied with stimulus complexity for both groups; less precise temporal processing was observed with increasing stimulus complexity. Notably, individuals with ASD showed a speech-specific deficit in multisensory temporal processing. Most importantly, the strength of perceptual binding of audiovisual speech observed in individuals with ASD was strongly related to their low-level multisensory temporal processing abilities. Collectively, the results represent the first to illustrate links between multisensory temporal function and speech processing in ASD, strongly suggesting that deficits in low-level sensory processing may cascade into higher-order domains, such as language and communication.
- Published
- 2014
- Full Text
- View/download PDF
19. Behavioral evidence for task-dependent 'what' versus 'where' processing within and across modalities
- Author
-
Fiona N. Newell and Jason S. Chan
- Subjects
Adult ,Male ,Adolescent ,Experimental and Cognitive Psychology ,Sensory system ,Task (project management) ,Mental Processes ,Stimulus modality ,Neuroimaging ,Psychology ,Humans ,GeneralLiterature_REFERENCE(e.g.,dictionaries,encyclopedias,glossaries) ,General Psychology ,Haptic technology ,Modality (human–computer interaction) ,Modalities ,Information processing ,Recognition, Psychology ,Sensory Systems ,Form Perception ,Touch ,Space Perception ,Visual Perception ,Female ,Color Perception ,Cognitive psychology - Abstract
Task-dependent information processing for the purpose of recognition or spatial perception is considered a principle common to all the main sensory modalities. Using a dual-task interference paradigm, we investigated the behavioral effects of independent information processing for shape identification and localization of object features within and across vision and touch. In Experiment 1, we established that color and texture processing (i.e., a 'what' task) interfered with both visual and haptic shape-matching tasks and that mirror image and rotation matching (i.e., a 'where' task) interfered with a feature-location-matching task in both modalities. In contrast, interference was reduced when a 'where' interference task was embedded in a 'what' primary task and vice versa. In Experiment 2, we replicated this finding within each modality, using the same interference and primary tasks throughout. In Experiment 3, the interference tasks were always conducted in a modality other than the primary task modality. Here, we found that resources for identification and spatial localization are independent of modality. Our findings further suggest that multisensory resources for shape recognition also involve resources for spatial localization. These results extend recent neuropsychological and neuroimaging findings and have important implications for our understanding of high-level information processing across the human sensory systems. This study was funded by the European Commission under 'Information Society Technologies' Program Grant IST-2001-34712 and by the Irish Research Council for the Humanities and Social Sciences.
- Published
- 2008
- Full Text
- View/download PDF
20. Temporal integration of multisensory stimuli in autism spectrum disorder: a predictive coding perspective
- Author
-
Jochen Kaiser, Anne Langer, and Jason S. Chan
- Subjects
Male ,Simultaneity ,Time Factors ,Autism Spectrum Disorder ,media_common.quotation_subject ,Illusion ,050105 experimental psychology ,03 medical and health sciences ,Typically developing ,0302 clinical medicine ,Predictive Value of Tests ,medicine ,Humans ,0501 psychology and cognitive sciences ,Biological Psychiatry ,media_common ,Predictive coding ,05 social sciences ,Perspective (graphical) ,Multisensory integration ,medicine.disease ,Illusions ,Asynchrony (computer programming) ,Psychiatry and Mental health ,Sound ,Neurology ,Acoustic Stimulation ,Autism spectrum disorder ,Auditory Perception ,Visual Perception ,Female ,Neurology (clinical) ,Psychology ,030217 neurology & neurosurgery ,Photic Stimulation ,Cognitive psychology - Abstract
Recently, a growing number of studies have examined the role of multisensory temporal integration in people with autism spectrum disorder (ASD). Some studies have used temporal order judgments or simultaneity judgments to examine the temporal binding window, while others have employed multisensory illusions, such as the sound-induced flash illusion (SiFi). The SiFi is an illusion created by presenting two beeps along with one flash. Participants perceive two flashes if the stimulus-onset asynchrony (SOA) between the two flashes is brief. The temporal binding window can be measured by modulating the SOA between the beeps. Each of these tasks has been used to compare the temporal binding window in people with ASD and typically developing individuals; however, the results have been mixed. While temporal order and simultaneity judgment tasks have shown little temporal binding window differences between groups, studies using the SiFi have found a wider temporal binding window in ASD compared to controls. In this paper, we discuss these seemingly contradictory findings and suggest that predictive coding may be able to explain the differences between these tasks.
- Published
- 2015
21. Presenting multiple auditory signals using multiple sound cards in Visual Basic 6.0
- Author
-
Jason S. Chan and Charles Spence
- Subjects
Visual Basic ,Psychology, Experimental ,Computer science ,Speech recognition ,media_common.quotation_subject ,Auditory display ,Pentium ,Experimental and Cognitive Psychology ,DirectX ,Cocktail party effect ,Presentation ,Acoustic Stimulation ,Microcomputers ,Embodied cognition ,Auditory Perception ,Psychology (miscellaneous) ,Software requirements ,computer ,Software ,General Psychology ,media_common ,computer.programming_language - Abstract
In auditory research, it is often desirable to present more than two auditory stimuli at any one time. Although the technology has been available for some time, the majority of researchers have not utilized it. This article provides a simple means of presenting multiple, concurrent, independent auditory events, using two or more different sound cards installed within a single computer. By enabling the presentation of more auditory events, we can hope to gain a better understanding of the cognitive and attentional processes operating under more complex and realistic scenes, such as that embodied by the cocktail party effect. The software requirements are Windows 98SR2/Me/NT4/2000/XP, Visual Basic 6.0, and DirectX 7.0 or above. The hardware requirements are a Pentium II, 128 MB RAM, and two or more different sound cards.
- Published
- 2003
- Full Text
- View/download PDF
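Record 21 describes driving two or more sound cards at once from Visual Basic 6.0 and DirectX so that several independent auditory streams can be presented concurrently. That original code is not reproduced here; as a loose modern analogue, the Python sketch below plays two independent tones on two separate output devices at the same time using the third-party sounddevice library. The library choice, device indices, and tone parameters are assumptions for illustration, not the paper's implementation.

```python
import threading
import numpy as np
import sounddevice as sd  # third-party; install with `pip install sounddevice`

FS = 44100  # sample rate in Hz

def tone(freq_hz, dur_s):
    """Mono sine tone as a float32 column vector (frames x 1 channel)."""
    t = np.arange(int(FS * dur_s)) / FS
    samples = 0.2 * np.sin(2 * np.pi * freq_hz * t)
    return samples.astype(np.float32).reshape(-1, 1)

def play_on_device(device_index, samples):
    """Open a blocking output stream on one device and write the whole buffer."""
    with sd.OutputStream(device=device_index, samplerate=FS, channels=1) as stream:
        stream.write(samples)

# Hypothetical device indices; list the real ones with `python -m sounddevice`.
DEVICE_A, DEVICE_B = 1, 3

# One thread per device so the two streams play concurrently.
threads = [
    threading.Thread(target=play_on_device, args=(DEVICE_A, tone(440.0, 2.0))),
    threading.Thread(target=play_on_device, args=(DEVICE_B, tone(880.0, 2.0))),
]
for th in threads:
    th.start()
for th in threads:
    th.join()
```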
22. Expanded temporal binding windows in people with mild cognitive impairment
- Author
-
Michael Hogan, David Prvulovic, Mareike Brandl, Silke Matura, Jason S. Chan, Jochen Kaiser, and Marcus J. Naumer
- Subjects
Male ,Time Factors ,media_common.quotation_subject ,Illusion ,Sensory system ,Neuropsychological Tests ,Statistics, Nonparametric ,Perception ,Reaction Time ,Humans ,Cognitive Dysfunction ,Cognitive impairment ,media_common ,Aged ,Language impairment ,Cognition ,Verbal Learning ,Illusions ,Neurology ,Acoustic Stimulation ,Clinical diagnosis ,Female ,Neurology (clinical) ,Psychology ,Photic Stimulation ,Cognitive psychology - Abstract
Previous studies investigating mild cognitive impairment (MCI) have focused primarily on cognitive, memory, attention, and executive function deficits. There has been relatively little research on the perceptual deficits people with MCI may exhibit. This is surprising given that it has been suggested that sensory and cognitive functions share a common cortical framework [1]. In the following study, we presented the sound-induced flash illusion (SiFi) to a group of participants with mild cognitive impairment (MCI) and healthy controls (HC). The SiFi is an audio-visual illusion whereby two beeps and one flash are presented. Participants tend to perceive two flashes when the time-interval between the auditory beeps is small [2, 3]. Participants with MCI perceived significantly more illusions compared to HC over longer auditory time-intervals. This suggests that people with MCI integrate more (arguably irrelevant) audiovisual information compared to HCs. By incorporating perceptual tasks into a clinical diagnosis it may be possible to gain a more comprehensive understanding of the disease, as well as provide a more accurate diagnosis for those who may have a language impairment.
- Published
- 2014
23. 208 INTRA-INDIVIDUAL DIFFERENCES IN SUSCEPTIBILITY TO THE SOUND-INDUCED FLASH ILLUSION
- Author
-
Shannon K. Connolly, Jason S. Chan, and Annalisa Setti
- Subjects
Aging ,geography ,geography.geographical_feature_category ,media_common.quotation_subject ,05 social sciences ,Illusion ,General Medicine ,050105 experimental psychology ,03 medical and health sciences ,Flash (photography) ,0302 clinical medicine ,0501 psychology and cognitive sciences ,Geriatrics and Gerontology ,Psychology ,030217 neurology & neurosurgery ,Sound (geography) ,Cognitive psychology ,media_common - Published
- 2016
- Full Text
- View/download PDF
24. Static images of novel, moveable objects learned through touch activate visual area hMT+
- Author
-
Fiona N. Newell, Hugh Garavan, Jason S. Chan, and Cristina Simões-Franklin
- Subjects
Adult ,Male ,genetic structures ,Cognitive Neuroscience ,Motion Perception ,Neuropsychological Tests ,Motion (physics) ,Motion ,Young Adult ,Stimulus modality ,Physical Stimulation ,Humans ,Learning ,Computer vision ,Visual Pathways ,Set (psychology) ,Haptic technology ,Brain Mapping ,Modality (human–computer interaction) ,Crossmodal ,business.industry ,Cognitive neuroscience of visual object recognition ,Brain ,Middle Aged ,Magnetic Resonance Imaging ,Neurology ,Touch Perception ,Pattern Recognition, Physiological ,Pattern recognition (psychology) ,Visual Perception ,Female ,Artificial intelligence ,business ,Psychology ,psychological phenomena and processes ,Photic Stimulation - Abstract
Although many studies have found similar cortical areas activated during the recognition of objects encoded through vision or touch, little is known about cortical areas involved in the crossmodal recognition of dynamic objects. Here, we investigated which cortical areas are involved in the recognition of moving objects and were specifically interested in whether motion areas are involved in the recognition of dynamic objects within and across sensory modalities. Prior to scanning, participants first learned to recognise a set of 12 novel objects, each presented either visually or haptically, and either moving or stationary. We then conducted fMRI whilst participants performed an old-new task with static images of learned or not-learned objects. We found the fusiform and right inferior frontal gyri more activated during within-modal visual than crossmodal object recognition. Our results also revealed increased activation in area hMT+, LOC and the middle occipital gyrus, in the right hemisphere only, for the objects learned as moving compared to the learned static objects, regardless of modality. We propose that the network of cortical areas involved in the recognition of dynamic objects is largely independent of modality; these findings have important implications for understanding the neural substrates of multisensory dynamic object recognition.
- Published
- 2009
25. Investigating Visuo-tactile Recognition of Unfamiliar Moving Objects
- Author
-
Fiona N. Newell, T. Aisling Whitaker, and Jason S. Chan
- Subjects
Haptic memory ,Form perception ,Computer science ,business.industry ,3D single-object recognition ,Deep-sky object ,Cognitive neuroscience of visual object recognition ,Object model ,Computer vision ,Artificial intelligence ,Object (computer science) ,business ,Haptic technology - Abstract
Previous research on haptic object recognition has focused mainly on static objects and very little is understood about the role of dynamic information in haptic object recognition. In this study we examined whether motion, particularly dynamic object parts, is combined with shape information in the representation of an object in haptic memory. In our behavioural studies we found that target objects previously learned as moving objects were more easily recognized when presented dynamically than when presented as static objects, even though shape information alone was sufficient to recognize each object. Moreover, cross-modal, visuo-tactile object recognition was better for dynamic than static objects.
- Published
- 2008
- Full Text
- View/download PDF
26. The influence of gender incongruence on the McGurk-percept: A combined behavioural and fMRI study
- Author
-
Jochen Kaiser, Jasper J. F. van den Bosch, Annika Notbohm, Jason S. Chan, and Marcus J. Naumer
- Subjects
Neural correlates of consciousness ,Fusiform gyrus ,Speech perception ,genetic structures ,Cognitive Neuroscience ,media_common.quotation_subject ,Illusion ,Experimental and Cognitive Psychology ,Context (language use) ,Sensory Systems ,Developmental psychology ,Ophthalmology ,Superior temporal gyrus ,McGurk effect ,Computer Vision and Pattern Recognition ,Percept ,Psychology ,psychological phenomena and processes ,Cognitive psychology ,media_common - Abstract
The McGurk-effect (McGurk and MacDonald, 1976) is a robust illusion which is broadly studied in the context of audiovisual integration. In the illusion, auditory speech perception is modified by discrepant visual lip-movements when presented synchronously, leading to a third, not physically present percept. There is an ongoing debate as to whether the percept is integrated early in the sensory system or if it is susceptible to cognitive intervention. Here, the McGurk-effect was studied using gender congruency (face/voice stimuli) as the cognitive intervention to address the question of integration. We first investigated changes in prevalence and reaction times due to perceived gender discrepancy during the McGurk-percept. In the second experiment, neural correlates of the gender discrepant McGurk-percept were studied using fMRI. We did not find differences in reaction times or accuracy between gender congruent or incongruent stimuli. Thus it appears the McGurk-percept itself is unaffected by gender incongruent stimuli on a behavioural level. In the fMRI experiment, we found the superior temporal gyrus to reveal a significantly increased activity when the McGurk-effect is perceived as compared to non-McGurk trials, but no effect in this area was found for gender congruency. However, a 2 × 2 whole brain ANOVA revealed a significant interaction in the face identity processing area (fusiform gyrus) as well as inferior parietal gyrus and superior colliculus. We suggest a mechanism that facilitates a stable illusory percept even when the McGurk stimuli are gender discrepant. To our knowledge this is the first study to demonstrate on a neural level that gender congruency affects processing of the McGurk-effect.
- Published
- 2013
- Full Text
- View/download PDF
27. That smells blue! Differences between colour associations for odours and odour-evocative words
- Author
-
Kirsten J. McKenzie, Jiana Ren, Carmel A. Levitan, Andy T. Woods, Christine Xiang Ru Leong, Jai Levin, Jason S. Chan, and Michael Dodson
- Subjects
Ophthalmology ,Cognitive Neuroscience ,Odour perception ,Experimental and Cognitive Psychology ,Context (language use) ,Computer Vision and Pattern Recognition ,Psychology ,Social psychology ,Sensory cue ,Sensory Systems ,Preference - Abstract
Strong associations exist between specific odours and colours, and these associations have been found to be both consistent within populations and over time (Gilbert et al., 1996). Experimental manipulations of these associations have shown that both taste and odour perception rely heavily upon visual cues (e.g., Blackwell, 1995; Sakai et al., 2005); participants often make errors in odour judgements when stimuli have been artificially coloured (Morrot et al., 2001), and the presence of a strongly-associated colour can greatly enhance the detection of an odour and the intensity of aromas or flavours (Zellner and Kautz, 1990; Zellner and Whitten, 1999), as well as preference and enjoyment (Herz, 2001; Herz and Beland, 2004). Such associations between colour and odour appear to be based on prior experience (Blackwell, 1995; Morrot et al., 2001; Sakai et al., 2005; Stevenson and Oaten, 2008), and odours are usually perceived alongside visual, taste and tactile sensations, as well as higher order cues such as shape, size and object labelling. As such, an odour may be perceived quite differently depending upon its current multisensory context, and experiencing an odour without these additional cues is likely to be different from experiencing the odour in a natural multisensory environment. Here we explore whether odour-evocative words, rich in semantic connotations, differ in their colour associations compared to those associated with the odours themselves. Twenty individuals were tested in each of four geographical locations: Germany, Malaysia, the Netherlands, and the United States of America. Participants chose the three colours they most closely associated with both odours and odour-words from a chart of 36, using Xperiment software (www.xperiment.mobi). Preliminary results indicate that there were differences between odour-evocative words and odour cues in terms of the associated colours for all populations.
- Published
- 2012
- Full Text
- View/download PDF
28. The effect of non-informative spatial sounds on haptic scene recognition
- Author
-
Fiona N. Newell and Jason S. Chan
- Subjects
Modality (human–computer interaction) ,General Computer Science ,Recall ,Computer science ,business.industry ,media_common.quotation_subject ,Spatial memory ,Stereotaxy ,Perception ,Encoding (memory) ,Contrast (vision) ,Computer vision ,Artificial intelligence ,Electrical and Electronic Engineering ,business ,media_common ,Haptic technology - Abstract
Previous studies found that performance in tactile or haptic spatial tasks improved when non-informative visual information was available, suggesting that vision provides a precise spatial frame to which tactile information is referred. Here, we explored whether another intrinsically spatial modality, audition, can also affect haptic recognition. In all experiments, blindfolded participants first learned a scene through touch and were subsequently required to recognise the scene. We found no effect on haptic performance when white noise stimuli were presented from specific locations (Experiment 1). However, performance was significantly reduced by pure tone stimuli presented from the same locations (Experiment 2); moreover, these tones disrupted recall but not encoding of the haptic scene (Experiment 3). In Experiment 4, we found that spatial rather than non-spatial auditory information was required to affect haptic performance. Finally, in Experiment 5 we found no specific benefit for familiar sound cues over unfamiliar or no sounds on haptic spatial performance. Our findings suggest that, in contrast to vision, auditory information is unlikely to have sufficient spatial precision and therefore disrupts, rather than supports, the spatial representation of haptic information. Our results add to a growing body of evidence for multisensory influences in the perception of space.
- Published
- 2013
- Full Text
- View/download PDF
29. Aurally aided visual search in depth using 'virtual' crowds of people
- Author
-
Fiona N. Newell, Henry J. Rice, Paul McDonald, Corrina Maguinness, Simon Dobbyn, Carol O'Sullivan, and Jason S. Chan
- Subjects
Visual search ,Ophthalmology ,Crowds ,Computer science ,Human–computer interaction ,Sensory Systems - Published
- 2010
- Full Text
- View/download PDF
30. The effects of characteristic and spatially congruent sounds on visual search in natural visual scenes
- Author
-
Fiona N. Newell, Jason S. Chan, and Daniel Rogers
- Subjects
Visual search ,Ophthalmology ,Communication ,business.industry ,Speech recognition ,Natural (music) ,Psychology ,business ,Sensory Systems - Published
- 2010
- Full Text
- View/download PDF