19 results on "Bayet, L."
Search Results
2. The neural architecture and developmental course of face processing
- Author
-
Bayet, L., primary and Nelson, C.A., additional
- Published
- 2020
- Full Text
- View/download PDF
3. Contributors
- Author
-
Aguero, Ariel, primary, Akshoomoff, Natacha A., additional, Ango, Fabrice, additional, Bauer, Patricia J., additional, Bayet, L., additional, Beltz, Adriene M., additional, Berenbaum, Sheri A., additional, Bodison, Stefanie C., additional, Burton, S.D., additional, Buzzell, G.A., additional, Cheetham, Claire E.J., additional, Claire, Hughes, additional, Colby, John B., additional, Conejero, A., additional, Davis, Elysia Poggi, additional, Decety, Jean, additional, Doom, Jenalee R., additional, Dugan, Jessica A., additional, Engmann, Anne, additional, Feldman, Daniel E., additional, Finch, Kayla H., additional, Fox, N.A., additional, Gerfen, Charles R., additional, Gittis, Aryn H., additional, Goodrich, L.V., additional, Gunnar, Megan R., additional, Haist, Frank, additional, Hawkes, Richard, additional, Hooks, Bryan M., additional, Johnson, Mark H., additional, Johnson, Scott P., additional, Kano, Masanobu, additional, Kanold, P.O., additional, Kelly, Dominic P., additional, Kim, Taehyeon, additional, Lahat, A., additional, Lany, Jill, additional, Lepousez, G., additional, Lledo, P.-M., additional, Macklis, Jeffrey D., additional, Michalska, Kalina J., additional, Molnár, Zoltán, additional, Nelson, C.A., additional, Noroña, Amanda N., additional, Ozkan, Abdulkadir, additional, Pignatelli, Michele, additional, Richardson, Hilary, additional, Rockland, Kathleen S., additional, Rowland, Benjamin A., additional, Rueda, M.R., additional, Sahni, Vibhu, additional, Saxe, Rebecca, additional, Sotelo, Constantino, additional, Sowell, Elizabeth R., additional, Stanford, Terrence R., additional, Stein, Barry E., additional, Stiles, Joan, additional, Tager-Flusberg, Helen, additional, Thompson, Abbie, additional, Wachowiak, M., additional, and Watanabe, Masahiko, additional
- Published
- 2020
- Full Text
- View/download PDF
4. Réévaluation et amélioration de la sécurité des barrages en maçonnerie et en béton [Reassessment and improvement of the safety of masonry and concrete dams]
- Author
-
Ho Ta Khanh, M., Tinland, J.M., Bourdarot, E., Alonso, E., Rizzoli, J.L., Francq, J., Lino, M., Bayet, L., and Irstea Publications, Migration
- Subjects
- [SDE] Environmental Sciences, CEMAGREF
- Abstract
The first part covers improvements made to the stability of gravity dams and buttress dams, and the reinforcement work performed on fourteen old masonry dams. The second part shows, using examples, how numerical modelling helps explain the behavior of concrete dams and contributes to a better evaluation of their safety.
- Published
- 1994
5. Time-resolved multivariate pattern analysis of infant EEG data: A practical tutorial.
- Author
-
Ashton K, Zinszer BD, Cichy RM, Nelson CA 3rd, Aslin RN, and Bayet L
- Subjects
- Adult, Child, Electroencephalography methods, Humans, Infant, Multivariate Analysis, Neuroimaging methods, Brain, Cognitive Neuroscience
- Abstract
Time-resolved multivariate pattern analysis (MVPA), a popular technique for analyzing magneto- and electro-encephalography (M/EEG) neuroimaging data, quantifies the extent and time-course by which neural representations support the discrimination of relevant stimulus dimensions. As EEG is widely used for infant neuroimaging, time-resolved MVPA of infant EEG data is a particularly promising tool for infant cognitive neuroscience. MVPA has recently been applied to common infant imaging methods such as EEG and fNIRS. In this tutorial, we provide and describe code to implement time-resolved, within-subject MVPA with infant EEG data. An example implementation of time-resolved MVPA based on linear SVM classification is described, with accompanying code in Matlab and Python. Results from a test dataset indicated that in both infants and adults this method reliably produced above-chance accuracy for classifying stimulus images. Extensions of the classification analysis are presented, including both geometric- and accuracy-based representational similarity analysis, implemented in Python. Common implementation choices are presented and discussed. As the amount of artifact-free EEG data contributed by each participant is lower in studies of infants than in studies of children and adults, we also explore and discuss the impact of varying participant-level inclusion thresholds on the resulting MVPA findings in these datasets. (Copyright © 2022 The Authors. Published by Elsevier Ltd. All rights reserved.) (A brief code sketch of this kind of analysis follows this record.)
- Published
- 2022
- Full Text
- View/download PDF
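The sketch below is not the tutorial's published code; it is a minimal, hypothetical illustration of time-resolved, within-subject pairwise decoding with a linear SVM, assuming an epoched EEG array of shape (trials, channels, time points), an integer condition label per trial, and scikit-learn available. Names such as timeresolved_pairwise_accuracy are illustrative only.

    # Minimal sketch (assumed data layout; not the tutorial's accompanying code).
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import StratifiedKFold
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    def timeresolved_pairwise_accuracy(epochs, labels, cond_a, cond_b, n_folds=4, seed=0):
        """Cross-validated decoding accuracy at each time point for one pair of conditions."""
        mask = np.isin(labels, [cond_a, cond_b])
        X, y = epochs[mask], labels[mask]          # X: (trials, channels, times)
        n_times = X.shape[2]
        accuracy = np.zeros(n_times)
        cv = StratifiedKFold(n_splits=n_folds, shuffle=True, random_state=seed)
        clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
        for t in range(n_times):                   # train and test separately at each time point
            scores = []
            for train, test in cv.split(X[:, :, t], y):
                clf.fit(X[train, :, t], y[train])
                scores.append(clf.score(X[test, :, t], y[test]))
            accuracy[t] = np.mean(scores)
        return accuracy                            # chance is 0.5 for a pairwise contrast

Averaging such pairwise accuracy curves across condition pairs and testing them against chance across participants is the general shape of the analysis the tutorial describes; the specific preprocessing, trial handling, and statistics are detailed in the tutorial itself.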
6. Sensitivity to face animacy and inversion in childhood: Evidence from EEG data.
- Author
-
Bayet L, Saville A, and Balas B
- Subjects
- Adult, Child, Child, Preschool, Cross-Sectional Studies, Humans, Orientation, Pattern Recognition, Visual, Photic Stimulation, Electroencephalography, Evoked Potentials
- Abstract
Adults exhibit relative behavioral difficulties in processing inanimate, artificial faces compared to real human faces, with implications for using artificial faces in research and designing artificial social agents. However, the developmental trajectory of inanimate face perception is unknown. To address this gap, we used electroencephalography to investigate inanimate face processing in cross-sectional groups of 5-10-year-old children and adults. A face inversion manipulation was used to test whether face animacy processing relies on expert face processing strategies. Groups of 5-7-year-olds (N = 18), 8-10-year-olds (N = 18), and adults (N = 16) watched pictures of real or doll faces presented in an upright or inverted orientation. Analyses of event-related potentials revealed larger N170 amplitudes in response to doll faces, irrespective of age group or face orientation. Thus, the N170 is sensitive to face animacy by 5-7 years of age, but such sensitivity may not reflect high-level, expert face processing. Multivariate pattern analyses of the EEG signal additionally assessed whether animacy information could be reliably extracted during face processing. Face orientation, but not face animacy, could be reliably decoded from occipitotemporal channels in children and adults. Face animacy could be decoded from whole-scalp channels in adults, but not children. Together, these results suggest that 5-10-year-old children exhibit some sensitivity to face animacy over occipitotemporal regions that is comparable to adults. (Copyright © 2021 Elsevier Ltd. All rights reserved.) (A brief sketch of an N170 amplitude comparison follows this record.)
- Published
- 2021
- Full Text
- View/download PDF
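As a companion to the ERP result above, the following is a hypothetical sketch of how a mean N170 amplitude per condition might be quantified, assuming condition-averaged ERPs of shape (channels, time points), a time vector in seconds, and an index of occipitotemporal channels. The 130-200 ms window and all names are assumptions, not the paper's parameters.

    # Illustrative N170 quantification (assumed window and channel selection).
    import numpy as np

    def mean_amplitude(erp, times, chan_idx, tmin=0.13, tmax=0.20):
        """Mean ERP amplitude over selected channels within a post-stimulus window."""
        window = (times >= tmin) & (times <= tmax)
        return erp[chan_idx][:, window].mean()

    # e.g., compare conditions (hypothetical names):
    # n170_doll = mean_amplitude(erp_by_condition["doll_upright"], times, occipitotemporal_idx)
    # n170_real = mean_amplitude(erp_by_condition["real_upright"], times, occipitotemporal_idx)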
7. Neural responses to happy, fearful and angry faces of varying identities in 5- and 7-month-old infants.
- Author
-
Bayet L, Perdue KL, Behrendt HF, Richards JE, Westerlund A, Cataldo JK, and Nelson CA 3rd
- Subjects
- Child, Preschool, Cross-Sectional Studies, Facial Expression, Female, Humans, Infant, Fear, Happiness
- Abstract
The processing of facial emotion is an important social skill that develops throughout infancy and early childhood. Here we investigate the neural underpinnings of the ability to process facial emotion across changes in facial identity in cross-sectional groups of 5- and 7-month-old infants. We simultaneously measured neural metabolic, behavioral, and autonomic responses to happy, fearful, and angry faces of different female models using functional near-infrared spectroscopy (fNIRS), eye-tracking, and heart rate measures. We observed significant neural activation to these facial emotions in a distributed set of frontal and temporal brain regions, and longer looking to the mouth region of angry faces compared to happy and fearful faces. No differences in looking behavior or neural activations were observed between 5- and 7-month-olds, although several exploratory, age-independent associations between neural activations and looking behavior were noted. Overall, these findings suggest more developmental stability than previously thought in responses to emotional facial expressions of varying identities between 5 and 7 months of age. (Copyright © 2020 The Authors. Published by Elsevier Ltd. All rights reserved.)
- Published
- 2021
- Full Text
- View/download PDF
8. Temporal dynamics of visual representations in the infant brain.
- Author
-
Bayet L, Zinszer BD, Reilly E, Cataldo JK, Pruitt Z, Cichy RM, Nelson CA 3rd, and Aslin RN
- Subjects
- Adolescent, Adult, Female, Humans, Infant, Male, Young Adult, Brain physiology, Pattern Recognition, Visual physiology
- Abstract
Tools from computational neuroscience have facilitated the investigation of the neural correlates of mental representations. However, access to the representational content of neural activations early in life has remained limited. We asked whether patterns of neural activity elicited by complex visual stimuli (animals, human body) could be decoded from EEG data gathered from 12-15-month-old infants and adult controls. We assessed pairwise classification accuracy at each time-point after stimulus onset, for individual infants and adults. Classification accuracies rose above chance in both groups within 500 ms. In contrast to adults, neural representations in infants were not linearly separable across visual domains. Representations were similar within, but not across, age groups. These findings suggest a developmental reorganization of visual representations between the second year of life and adulthood and provide a promising proof of concept for the feasibility of decoding EEG data within-subject to assess how the infant brain dynamically represents visual objects. (Copyright © 2020 The Authors. Published by Elsevier Ltd. All rights reserved.)
- Published
- 2020
- Full Text
- View/download PDF
9. Pathways to social-emotional functioning in the preschool period: The role of child temperament and maternal anxiety in boys and girls.
- Author
-
Behrendt HF, Wade M, Bayet L, Nelson CA, and Bosquet Enlow M
- Subjects
- Anxiety, Child, Child, Preschool, Emotions, Female, Humans, Male, Social Adjustment, Problem Behavior, Temperament
- Abstract
Individual differences in social-emotional functioning emerge early and have long-term implications for developmental adaptation and competency. Research is needed that specifies multiple early risk factors and outcomes simultaneously to demonstrate specificity. Using multigroup longitudinal path analysis in a sample of typically developing children (N = 541), we examined child temperament dimensions (surgency, negative affectivity, and regulation/effortful control) and maternal anxiety in infancy and at age 2 as predictors of child externalizing, internalizing, dysregulation, and competence behaviors at age 3. Four primary patterns emerged. First, there was stability in temperament dimensions and maternal anxiety from infancy to age 3. Second, negative affectivity was implicated in internalizing problems and surgency in externalizing problems. Third, effortful control at age 2 was a potent mediator of the effect of maternal anxiety in infancy on age 3 outcomes. Fourth, there was suggestive evidence for transactional effects between maternal anxiety and child effortful control. Most pathways operated similarly for boys and girls, with some differences, particularly for surgency. These findings expand our understanding of the roles of specific temperamental domains and postnatal maternal anxiety in a range of social-emotional outcomes in the preschool period, and have implications for efforts to enhance the development of young children's social-emotional functioning and reduce risk for later psychopathology.
- Published
- 2020
- Full Text
- View/download PDF
10. Auditory Processing of Speech and Tones in Children With Tuberous Sclerosis Complex.
- Author
-
O'Brien AM, Bayet L, Riley K, Nelson CA, Sahin M, and Modi ME
- Abstract
Individuals with Tuberous Sclerosis Complex (TSC) have atypical white matter integrity and neural connectivity in the brain, including language pathways. To explore functional activity associated with auditory and language processing in individuals with TSC, we used electroencephalography (EEG) to examine basic auditory correlates of detection (P1, N2, N4) and discrimination (mismatch negativity, MMN) of speech and non-speech stimuli for children with TSC and age- and sex-matched typically developing (TD) children. Children with TSC (TSC group) and without TSC (typically developing, TD group) participated in an auditory MMN paradigm containing two blocks of vowels (/a/ and /u/) and two blocks of tones (800 Hz and 400 Hz). Continuous EEG data were collected. Multivariate pattern analysis (MVPA) was used to explore the functional specificity of neural auditory processing. Speech-specific P1, N2, and N4 waveform components of the auditory evoked potential (AEP) were compared, and the mismatch response was calculated for both speech and tones. MVPA showed that the TD group, but not the TSC group, demonstrated above-chance pairwise decoding between speech and tones. The AEP component analysis suggested that while the TD group had an increased P1 amplitude in response to vowels compared to tones, the TSC group did not show this enhanced response to vowels. Additionally, the TD group had a greater N2 amplitude in response to vowels, but not tones, compared to the TSC group. The TSC group also demonstrated a longer N4 latency to vowels compared to tones, which was not seen in the TD group. No group differences were observed in the MMN response. In this study we identified features of the auditory response to speech sounds, but not acoustically matched tones, that differentiate children with TSC from TD children. (Copyright © 2020 O’Brien, Bayet, Riley, Nelson, Sahin and Modi.) (A brief sketch of the mismatch-response computation follows this record.)
- Published
- 2020
- Full Text
- View/download PDF
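The mismatch response described above is conventionally computed as a deviant-minus-standard difference wave; the sketch below is a hedged illustration of that step only, assuming condition-averaged ERPs of shape (channels, time points), a time vector in seconds, and a frontocentral channel selection. The 150-300 ms window and all names are assumptions, not the study's parameters.

    # Illustrative mismatch-response (MMN) computation (assumed layout and window).
    import numpy as np

    def mmn_amplitude(erp_standard, erp_deviant, times, chan_idx, tmin=0.15, tmax=0.30):
        """Mean of the deviant-minus-standard difference wave within an MMN window."""
        diff = erp_deviant - erp_standard
        window = (times >= tmin) & (times <= tmax)
        return diff[chan_idx][:, window].mean()

    # Computed separately for the vowel blocks and the tone blocks, per group,
    # this yields the speech and tone mismatch responses compared in the study.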
11. Recognition of facial emotions of varying intensities by three-year-olds.
- Author
-
Bayet L, Behrendt HF, Cataldo JK, Westerlund A, and Nelson CA
- Subjects
- Child, Preschool, Female, Humans, Male, Child Development physiology, Emotions physiology, Facial Expression, Facial Recognition physiology, Recognition, Psychology physiology
- Abstract
Early facial emotion recognition is hypothesized to be critical to later social functioning. However, relatively little is known about the typical intensity thresholds for recognizing facial emotions in preschoolers, between 2 and 4 years of age. This study employed a behavioral sorting task to examine the recognition of happy, fearful, and angry expressions of varying intensity in a large sample of 3-year-old children (N = 208). Thresholds were similar for all expressions; accuracy, however, was significantly lower for fear. Fear and anger expressions above threshold were significantly more confused with one another than with other expressions. In contrast, neutral faces were significantly more often interpreted as happy than as angry or fearful. These results provide a comparison point for future studies of early facial emotion recognition in typical and atypical populations of children in this age group. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
- Published
- 2018
- Full Text
- View/download PDF
12. Dynamics of neural representations when searching for exemplars and categories of human and non-human faces.
- Author
-
Bayet L, Zinszer B, Pruitt Z, Aslin RN, and Wu R
- Subjects
- Adult, Animals, Attention physiology, Brain Mapping, Electroencephalography methods, Evoked Potentials physiology, Face, Female, Hominidae, Humans, Male, Photic Stimulation, Recognition, Psychology physiology, Visual Perception physiology, Evoked Potentials, Visual physiology, Facial Recognition physiology, Pattern Recognition, Visual physiology
- Abstract
Face perception abilities in humans exhibit a marked expertise in distinguishing individual human faces at the expense of individual faces from other species (the other-species effect). In particular, one behavioural effect of such specialization is that human adults search for and find categories of non-human faces faster and more accurately than a specific non-human face, and vice versa for human faces. However, a recent visual search study showed that neural responses (event-related potentials, ERPs) were identical when finding either a non-human or human face. We used time-resolved multivariate pattern analysis of the EEG data from that study to investigate the dynamics of neural representations during a visual search for own-species (human) or other-species (non-human ape) faces, with greater sensitivity than traditional ERP analyses. The location of each target (i.e., right or left) could be decoded from the EEG, with similar accuracy for human and non-human faces. However, the neural patterns associated with searching for an exemplar versus a category target differed for human faces compared to non-human faces: Exemplar representations could be more reliably distinguished from category representations for human than non-human faces. These findings suggest that the other-species effect modulates the nature of representations, but preserves the attentional selection of target items based on these representations.
- Published
- 2018
- Full Text
- View/download PDF
13. Classifying the mental representation of word meaning in children with Multivariate Pattern Analysis of fNIRS.
- Author
-
Gemignani J, Bayet L, Kabdebon C, Blankertz B, Pugh KR, and Aslin RN
- Subjects
- Child, Hemodynamics, Humans, Multivariate Analysis, Support Vector Machine, Spectroscopy, Near-Infrared
- Abstract
This study presents the implementation of a within-subject neural decoder, based on Support Vector Machines, and its application for the classification of distributed patterns of hemodynamic activation, measured with Functional Near Infrared Spectroscopy (fNIRS) on children, in response to meaningful and meaningless auditory stimuli. Classification accuracy nominally exceeds chance level for the majority of the participants, but fails to reach statistical significance. Future work should investigate whether individual differences in classification accuracy may relate to other characteristics of the children, such as their cognitive, speech or reading abilities.
- Published
- 2018
- Full Text
- View/download PDF
14. Decoding semantic representations from functional near-infrared spectroscopy signals.
- Author
-
Zinszer BD, Bayet L, Emberson LL, Raizada RDS, and Aslin RN
- Abstract
This study uses representational similarity-based neural decoding to test whether semantic information elicited by words and pictures is encoded in functional near-infrared spectroscopy (fNIRS) data. In experiment 1, subjects passively viewed eight audiovisual word and picture stimuli for 15 min. Blood oxygen levels were measured using the Hitachi ETG-4000 fNIRS system with a posterior array over the occipital lobe and a left lateral array over the temporal lobe. Each participant's response patterns were abstracted to representational similarity space and compared to the group average (excluding that subject, i.e., leave-one-out cross-validation) and to a distributional model of semantic representation. Mean accuracy for both decoding tasks significantly exceeded chance. In experiment 2, we compared three group-level models by averaging the similarity structures from sets of eight participants in each group. In these models, the posterior array was accurately decoded by the semantic model, while the lateral array was accurately decoded in the between-groups comparison. Our findings indicate that semantic representations are encoded in the fNIRS data, preserved across subjects, and decodable by an extrinsic representational model. These results represent the first attempt to link the functional response pattern measured by fNIRS to higher-level representations of how words are related to each other. (A brief sketch of this similarity-based comparison follows this record.)
- Published
- 2018
- Full Text
- View/download PDF
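The leave-one-out comparison of similarity structures described above can be sketched as follows. This is a hedged illustration rather than the study's pipeline, assuming each participant's representational dissimilarity matrix has been vectorized (upper triangle) into a row of rdms and that model_rdm is the vectorized semantic-model matrix; all names are illustrative.

    # Illustrative leave-one-subject-out similarity comparison (assumed inputs).
    import numpy as np
    from scipy.stats import spearmanr

    def loo_similarity(rdms, model_rdm):
        """Correlate each subject's RDM with the leave-one-out group mean and with a model RDM."""
        n_subjects = rdms.shape[0]
        to_group = np.zeros(n_subjects)
        to_model = np.zeros(n_subjects)
        for s in range(n_subjects):
            group_mean = np.delete(rdms, s, axis=0).mean(axis=0)   # exclude subject s
            to_group[s] = spearmanr(rdms[s], group_mean).correlation
            to_model[s] = spearmanr(rdms[s], model_rdm).correlation
        return to_group, to_model

The study's decoding accuracies are derived from second-order comparisons of this kind; the exact matching and significance procedures are described in the paper itself.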
15. Fearful but not happy expressions boost face detection in human infants.
- Author
-
Bayet L, Quinn PC, Laboissière R, Caldara R, Lee K, and Pascalis O
- Subjects
- Face, Humans, Infant, Attention, Facial Expression, Fear, Happiness
- Abstract
Human adults show an attentional bias towards fearful faces, an adaptive behaviour that relies on amygdala function. This attentional bias emerges in infancy between 5 and 7 months, but the underlying developmental mechanism is unknown. To examine possible precursors, we investigated whether 3.5-, 6- and 12-month-old infants show facilitated detection of fearful faces in noise, compared to happy faces. Happy or fearful faces, mixed with noise, were presented to infants (N = 192), paired with pure noise. We applied multivariate pattern analyses to several measures of infant looking behaviour to derive a criterion-free, continuous measure of face detection evidence in each trial. Analyses of the resulting psychometric curves supported the hypothesis of a detection advantage for fearful faces compared to happy faces, from 3.5 months of age and across all age groups. Overall, our data show a readiness to detect fearful faces (compared to happy faces) in younger infants that developmentally precedes the previously documented attentional bias to fearful faces in older infants and adults. (© 2017 The Author(s).) (A brief sketch of a psychometric-curve fit follows this record.)
- Published
- 2017
- Full Text
- View/download PDF
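The psychometric-curve analysis mentioned above can be illustrated with a simple logistic fit; the sketch below is a hypothetical example, assuming per-level signal strengths (e.g., face visibility in noise) and the proportion of trials on which a face was detected at each level. The function form, lapse rate, and names are assumptions, not the study's model.

    # Illustrative psychometric fit (assumed data; not the study's analysis code).
    import numpy as np
    from scipy.optimize import curve_fit

    def psychometric(x, threshold, slope, lapse=0.02):
        """Logistic psychometric function with a small, fixed lapse rate."""
        return lapse + (1.0 - 2.0 * lapse) / (1.0 + np.exp(-slope * (x - threshold)))

    def fit_psychometric(signal_levels, prop_detected):
        """Fit threshold and slope by least squares; lapse stays at its default."""
        p0 = [np.median(signal_levels), 1.0]
        params, _ = curve_fit(psychometric, signal_levels, prop_detected, p0=p0, maxfev=10000)
        return params  # (threshold, slope), e.g., compared between fearful and happy faces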
16. Can human eyes prevent perceptual narrowing for monkey faces in human infants?
- Author
-
Damon F, Bayet L, Quinn PC, Hillairet de Boisferon A, Méary D, Dupierrix E, Lee K, and Pascalis O
- Subjects
- Age Factors, Animals, Child Development, Discrimination, Psychological, Eye anatomy & histology, Female, Haplorhini, Humans, Infant, Male, Photic Stimulation, Facial Recognition, Psychology, Child
- Abstract
Perceptual narrowing has been observed in human infants for monkey faces: 6-month-olds can discriminate between them, whereas older infants from 9 months of age display difficulty discriminating between them. The source of the difficulty that infants have in processing monkey faces from 9 months of age has not been clearly identified. It could be due to the structural characteristics of monkey faces, particularly the key facial features that differ from human faces. The current study aimed to investigate whether the information conveyed by the eyes is of importance. We examined whether the presence of Caucasian human eyes in monkey faces allows recognition to be maintained in 6-month-olds and facilitates recognition in 9- and 12-month-olds. Our results revealed that the presence of human eyes in monkey faces maintains recognition for those faces at 6 months of age and partially facilitates recognition of those faces at 9 months of age, but not at 12 months of age. The findings are interpreted in the context of perceptual narrowing and suggest that the attenuation of processing of other-species faces is not reversed by the presence of human eyes. (© 2015 Wiley Periodicals, Inc.)
- Published
- 2015
- Full Text
- View/download PDF
17. Face Gender Influences the Looking Preference for Smiling Expressions in 3.5-Month-Old Human Infants.
- Author
-
Bayet L, Quinn PC, Tanaka JW, Lee K, Gentaz É, and Pascalis O
- Subjects
- Female, Humans, Infant, Male, Sex Factors, Discrimination, Psychological, Face, Infant Behavior psychology, Smiling, Visual Perception physiology
- Abstract
Young infants are typically thought to prefer looking at smiling expressions. Although some accounts suggest that the preference is automatic and universal, we hypothesized that it is not rigid and may be influenced by other face dimensions, most notably the face's gender. Infants are sensitive to the gender of faces; for example, 3-month-olds raised by female caregivers typically prefer female over male faces. We presented neutral versus smiling pairs of faces from the same female or male individuals to 3.5-month-old infants (n = 25), controlling for low-level cues. Infants looked longer to the smiling face when faces were female but longer to the neutral face when faces were male, i.e., there was an effect of face gender on the looking preference for smiling. The results indicate that a preference for smiling in 3.5-month-olds is limited to female faces, possibly reflective of differential experience with male and female faces.
- Published
- 2015
- Full Text
- View/download PDF
18. Subjective report of eye fixations during serial search.
- Author
-
Marti S, Bayet L, and Dehaene S
- Subjects
- Adult, Attention, Eye Movement Measurements, Eye Movements, Female, Humans, Male, Pattern Recognition, Visual, Young Adult, Fixation, Ocular
- Abstract
Humans readily introspect upon their thoughts and their behavior, but how reliable are these subjective reports? In the present study, we explored the consistencies of and differences between the observer's subjective report and actual behavior within a single trial. On each trial of a serial search task, we recorded eye movements and the participants' beliefs of where their eyes moved. The comparison of reported versus real eye movements revealed that subjects successfully reported a subset of their eye movements. Limits in subjective reports stemmed from both the number and the type of eye movements. Furthermore, subjects sometimes reported eye movements they actually never made. A detailed examination of these reports suggests that they could reflect covert shifts of attention during overt serial search. Our data provide quantitative and qualitative measures of observers' subjective reports and reveal experimental effects of visual search that would otherwise be inaccessible. (Copyright © 2014 Elsevier Inc. All rights reserved.)
- Published
- 2015
- Full Text
- View/download PDF
19. Angry facial expressions bias gender categorization in children and adults: behavioral and computational evidence.
- Author
-
Bayet L, Pascalis O, Quinn PC, Lee K, Gentaz É, and Tanaka JW
- Abstract
Angry faces are perceived as more masculine by adults. However, the developmental course and underlying mechanism (bottom-up stimulus driven or top-down belief driven) associated with the angry-male bias remain unclear. Here we report that anger biases face gender categorization toward "male" responding in children as young as 5-6 years. The bias is observed for both own- and other-race faces, and is remarkably unchanged across development (into adulthood) as revealed by signal detection analyses (Experiments 1-2). The developmental course of the angry-male bias, along with its extension to other-race faces, combine to suggest that it is not rooted in extensive experience, e.g., observing males engaging in aggressive acts during the school years. Based on several computational simulations of gender categorization (Experiment 3), we further conclude that (1) the angry-male bias results, at least partially, from a strategy of attending to facial features or their second-order relations when categorizing face gender, and (2) any single choice of computational representation (e.g., Principal Component Analysis) is insufficient to assess resemblances between face categories, as different representations of the very same faces suggest different bases for the angry-male bias. Our findings are thus consistent with both stimulus-driven and stereotyped-belief-driven accounts of the angry-male bias. Taken together, the evidence suggests considerable stability in the interaction between some facial dimensions in social categorization that is present prior to the onset of formal schooling. (A brief signal-detection sketch follows this record.)
- Published
- 2015
- Full Text
- View/download PDF
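The signal detection analyses cited above rest on hit and false-alarm rates; the following is a hedged sketch of a standard d'/criterion computation, treating "male" responses to male faces as hits and "male" responses to female faces as false alarms, computed separately per expression. The counts and names are illustrative, and the correction used here (adding 0.5 to each cell) is one common convention, not necessarily the paper's.

    # Illustrative signal-detection computation (not the paper's exact procedure).
    from scipy.stats import norm

    def dprime_criterion(hits, misses, false_alarms, correct_rejections):
        """Return (d', criterion c) from response counts, with a log-linear correction."""
        hit_rate = (hits + 0.5) / (hits + misses + 1.0)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
        return z_hit - z_fa, -0.5 * (z_hit + z_fa)

A more liberal criterion (lower c) for "male" responses to angry than to neutral faces, with comparable d', would correspond to a response-bias account of the angry-male effect.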