16 results for '"Laetitia Hebert"'
Search Results
2. WormPose: Image synthesis and convolutional networks for pose estimation in C. elegans.
- Author
-
Laetitia Hebert, Tosif Ahamed, Antonio C Costa, Liam O'Shaughnessy, and Greg J Stephens
- Subjects
Biology (General), QH301-705.5
- Abstract
An important model system for understanding genes, neurons and behavior, the nematode worm C. elegans naturally moves through a variety of complex postures, for which estimation from video data is challenging. We introduce an open-source Python package, WormPose, for 2D pose estimation in C. elegans, including self-occluded, coiled shapes. We leverage advances in machine vision afforded from convolutional neural networks and introduce a synthetic yet realistic generative model for images of worm posture, thus avoiding the need for human-labeled training. WormPose is effective and adaptable for imaging conditions across worm tracking efforts. We quantify pose estimation using synthetic data as well as N2 and mutant worms in on-food conditions. We further demonstrate WormPose by analyzing long (∼ 8 hour), fast-sampled (∼ 30 Hz) recordings of on-food N2 worms to provide a posture-scale analysis of roaming/dwelling behaviors.
- Published
- 2021
- Full Text
- View/download PDF
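The record above describes supervising a pose-estimation network with synthetic image-centerline pairs instead of human labels. The sketch below illustrates that general idea only; the sinusoidal posture, tapering width profile, blur, and noise model are assumptions for illustration, not the WormPose generator.
```python
# A minimal sketch of the idea behind synthetic training data for worm pose
# estimation: render an image directly from a known centerline so that the
# (image, centerline) pair can supervise a network without human labels.
# This is NOT the WormPose generator; posture, width profile, noise model
# and image size are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

def synthetic_worm(size=128, n_points=50, amplitude=18.0, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    # Centerline: a sinusoid with random phase, centered in the image.
    s = np.linspace(0.0, 1.0, n_points)
    phase = rng.uniform(0, 2 * np.pi)
    x = size * (0.15 + 0.7 * s)
    y = size / 2 + amplitude * np.sin(2 * np.pi * s + phase)
    centerline = np.stack([x, y], axis=1)          # (n_points, 2), the "label"

    # Body width tapers toward head and tail.
    width = 4.0 * np.sin(np.pi * s) + 1.0

    # Rasterize: paint a disk at every centerline point.
    yy, xx = np.mgrid[0:size, 0:size]
    image = np.zeros((size, size), dtype=float)
    for (cx, cy), w in zip(centerline, width):
        image = np.maximum(image, ((xx - cx) ** 2 + (yy - cy) ** 2 <= w ** 2) * 1.0)

    # Mimic imaging conditions with blur and sensor noise.
    image = gaussian_filter(image, sigma=1.0)
    image += rng.normal(0.0, 0.05, image.shape)
    return np.clip(image, 0.0, 1.0), centerline

img, label = synthetic_worm()
print(img.shape, label.shape)   # (128, 128) (50, 2)
```
Because the centerline used to render each image is known exactly, every generated image arrives with a perfect training label for free, which is the point the abstract makes about avoiding human-labeled data.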
3. OrganoidTracker: Efficient cell tracking using machine learning and manual error correction.
- Author
-
Rutger N U Kok, Laetitia Hebert, Guizela Huelsz-Prince, Yvonne J Goos, Xuan Zheng, Katarzyna Bozek, Greg J Stephens, Sander J Tans, and Jeroen S van Zon
- Subjects
Medicine, Science
- Abstract
Time-lapse microscopy is routinely used to follow cells within organoids, allowing direct study of division and differentiation patterns. There is an increasing interest in cell tracking in organoids, which makes it possible to study their growth and homeostasis at the single-cell level. As tracking these cells by hand is prohibitively time consuming, automation using a computer program is required. Unfortunately, organoids have a high cell density and fast cell movement, which makes automated cell tracking difficult. In this work, a semi-automated cell tracker has been developed. To detect the nuclei, we use a machine learning approach based on a convolutional neural network. To form cell trajectories, we link detections at different time points together using a min-cost flow solver. The tracker raises warnings for situations with likely errors. Rapid changes in nucleus volume and position are reported for manual review, as well as cases where nuclei divide, appear and disappear. When the warning system is adjusted such that virtually error-free lineage trees can be obtained, still less than 2% of all detected nuclei positions are marked for manual analysis. This provides an enormous speed boost over manual cell tracking, while still providing tracking data of the same quality as manual tracking.
- Published
- 2020
- Full Text
- View/download PDF
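Record 3 links nucleus detections across time points with a min-cost flow solver. The sketch below shows a generic two-frame version of that formulation using networkx; the cost scaling, the fixed penalty for unmatched detections, and the one-to-one capacities are illustrative assumptions rather than OrganoidTracker's actual model, which also has to handle divisions, appearances and disappearances.
```python
# Link nucleus detections between two consecutive frames with a min-cost
# flow, the general class of method described in the record above.
import networkx as nx
import numpy as np

def link_frames(detections_t0, detections_t1):
    """detections_*: lists of (x, y, z) nucleus positions. Returns matched index pairs."""
    n0, n1 = len(detections_t0), len(detections_t1)
    G = nx.DiGraph()
    # The source supplies one unit of flow per detection at t0; the sink absorbs it.
    G.add_node("source", demand=-n0)
    G.add_node("sink", demand=n0)
    for i in range(n0):
        G.add_edge("source", ("t0", i), capacity=1, weight=0)
        # A detection may stay unmatched (e.g. the cell left the view) at a fixed penalty.
        G.add_edge(("t0", i), "sink", capacity=1, weight=500)
    for j in range(n1):
        G.add_edge(("t1", j), "sink", capacity=1, weight=0)
    # Candidate links, cost = integer-scaled Euclidean distance.
    for i, p in enumerate(detections_t0):
        for j, q in enumerate(detections_t1):
            cost = int(100 * np.linalg.norm(np.asarray(p) - np.asarray(q)))
            G.add_edge(("t0", i), ("t1", j), capacity=1, weight=cost)
    flow = nx.min_cost_flow(G)
    return [(i, j) for i in range(n0) for j in range(n1)
            if flow[("t0", i)].get(("t1", j), 0) > 0]

print(link_frames([(0, 0, 0), (5, 5, 0)], [(0.4, 0.1, 0), (5.2, 4.8, 0.1)]))
# [(0, 0), (1, 1)]
```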
4. Author Correction: Markerless tracking of an entire honey bee colony
- Author
-
Katarzyna Bozek, Laetitia Hebert, Yoann Portugal, Alexander S. Mikheyev, and Greg J. Stephens
- Subjects
Science
- Published
- 2021
- Full Text
- View/download PDF
5. Towards Dense Object Tracking in a 2D Honeybee Hive.
- Author
-
Katarzyna Bozek, Laetitia Hebert, Alexander S. Mikheyev, and Greg J. Stephens
- Published
- 2018
- Full Text
- View/download PDF
6. WormPose: Image synthesis and convolutional networks for pose estimation in C. elegans
- Author
-
Antonio de Lisboa Lopes Costa, Liam O'Shaughnessy, Greg J. Stephens, Laetitia Hebert, Tosif Ahamed, Physics of Living Systems, and LaserLaB - Molecular Biophysics
- Subjects
Nematoda, Machine vision, Computer science, Image Processing, Velocity, Markov models, Social Sciences, Convolutional neural network, Psychology, Hidden Markov models, Biology (General), Ecology, Artificial neural network, Animal Behavior, Physics, Applied Mathematics, Simulation and Modeling, Eukaryota, Classical Mechanics, Animal Models, Generative model, Computational Theory and Mathematics, Experimental Organism Systems, Modeling and Simulation, Physical Sciences, Engineering and Technology, Algorithms, Research Article, Computer and Information Sciences, Neural Networks, QH301-705.5, Imaging Techniques, Research and Analysis Methods, Models, Biological, Synthetic data, Cellular and Molecular Neuroscience, Motion, Model Organisms, Genetics, Animals, Computer Simulation, Caenorhabditis elegans, Molecular Biology, Pose, Ecology, Evolution, Behavior and Systematics, Behavior, Organisms, Biology and Life Sciences, Pattern recognition, Probability theory, Invertebrates, Signal Processing, Animal Studies, Caenorhabditis, Artificial intelligence, Neural Networks, Computer, Zoology, Mathematics, Neuroscience
- Abstract
An important model system for understanding genes, neurons and behavior, the nematode worm C. elegans naturally moves through a variety of complex postures, for which estimation from video data is challenging. We introduce an open-source Python package, WormPose, for 2D pose estimation in C. elegans, including self-occluded, coiled shapes. We leverage advances in machine vision afforded from convolutional neural networks and introduce a synthetic yet realistic generative model for images of worm posture, thus avoiding the need for human-labeled training. WormPose is effective and adaptable for imaging conditions across worm tracking efforts. We quantify pose estimation using synthetic data as well as N2 and mutant worms in on-food conditions. We further demonstrate WormPose by analyzing long (∼ 8 hour), fast-sampled (∼ 30 Hz) recordings of on-food N2 worms to provide a posture-scale analysis of roaming/dwelling behaviors.
Author summary: Recent advances in machine learning have enabled the high-resolution estimation of bodypoint positions of freely behaving animals, but manual labeling can render these methods imprecise and impractical, especially in highly deformable animals such as the nematode C. elegans. Such animals also frequently coil, resulting in complicated shapes whose ambiguity presents difficulties for standard pose estimation methods. Efficiently solving coiled shapes in C. elegans, exhibited in a variety of important natural contexts, is the primary limiting factor for fully automated high-throughput behavior analysis. WormPose provides pose estimation that works across imaging conditions, naturally complements existing worm trackers, and harnesses the power of deep convolutional networks but with an image generator to automatically provide precise image-centerline pairings for training. We apply WormPose to on-food recordings, finding a near absence of deep δ-turns. We also show that incoherent body motions in the dwell state, which do not translate the worm, have been misidentified as an increase in reversal rate by previous, centroid-based methods. We expect that the combination of a body model and image synthesis demonstrated in WormPose will be both of general interest and important for future progress in precise pose estimation in other slender-bodied and deformable organisms.
- Published
- 2021
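One practical detail in centerline-based worm pose estimation, as described in record 6, is that head and tail are often ambiguous in a single image, so a prediction is typically scored against both orderings of the ground-truth centerline. The function below is a hedged illustration of such a metric, not WormPose's own evaluation code; the matched arc-length sampling of points along the body is an assumption.
```python
# Illustrative scoring for centerline-based pose estimation: compare a
# predicted centerline against both head/tail orderings of the ground truth
# and keep the smaller error.
import numpy as np

def centerline_error(predicted, truth):
    """Mean point-to-point distance, taking the better of the two head/tail orderings.

    predicted, truth: (n_points, 2) arrays sampled at matching arc-length fractions.
    """
    predicted = np.asarray(predicted, dtype=float)
    truth = np.asarray(truth, dtype=float)
    forward = np.linalg.norm(predicted - truth, axis=1).mean()
    backward = np.linalg.norm(predicted - truth[::-1], axis=1).mean()
    return min(forward, backward)

truth = np.stack([np.linspace(0, 10, 20), np.zeros(20)], axis=1)
flipped_guess = truth[::-1] + 0.1          # correct shape, reversed ordering
print(centerline_error(flipped_guess, truth))   # ≈0.14, not ≈5
```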
7. Markerless tracking of an entire honey bee colony
- Author
-
Yoann Portugal, Laetitia Hebert, Katarzyna Bozek, Greg J. Stephens, Physics of Living Systems, and LaserLaB - Molecular Biophysics
- Subjects
Computer science, Science, General Physics and Astronomy, Crawling, Tracking (particle physics), Convolutional neural network, General Biochemistry, Genetics and Molecular Biology, Article, Image processing, Animals, Computer vision, Segmentation, Author Correction, Multidisciplinary, Artificial neural network, Behavior, Animal, Orientation (computer vision), Computational Biology, General Chemistry, Honey bee, Bees, Brood, Artificial intelligence, Neural Networks, Computer, Biological physics
- Abstract
From cells in tissue, to bird flocks, to human crowds, living systems display a stunning variety of collective behaviors. Yet quantifying such phenomena first requires tracking a significant fraction of the group members in natural conditions, a substantial and ongoing challenge. We present a comprehensive, computational method for tracking an entire colony of the honey bee Apis mellifera using high-resolution video on a natural honeycomb background. We adapt a convolutional neural network (CNN) segmentation architecture to automatically identify bee and brood cell positions, body orientations and within-cell states. We achieve high accuracy (~10% body width error in position, ~10° error in orientation, and true positive rate > 90%) and demonstrate months-long monitoring of sociometric colony fluctuations. These fluctuations include ~24 h cycles in the counted detections, negative correlation between bee and brood, and nightly enhancement of bees inside comb cells. We combine detected positions with visual features of organism-centered images to track individuals over time and through challenging occluding events, recovering ~79% of bee trajectories from five observation hives over 5 min timespans. The trajectories reveal important individual behaviors, including waggle dances and crawling inside comb cells. Our results provide opportunities for the quantitative study of collective bee behavior and for advancing tracking techniques of crowded systems.
Honey bee colonies are hard to automatically monitor due to the high number of visually similar members which move rapidly and whose numbers change over time. Here, the authors report a method for markerless tracking of a bee colony by adapting convolutional neural networks for detection and tracking.
- Published
- 2021
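Record 7 combines detected positions with visual features of organism-centered images to follow individuals across frames. A generic way to realize that combination, shown below, is to build a cost matrix mixing spatial and appearance distances and solve the resulting assignment problem; the weighting, feature dimensionality, and Hungarian-algorithm formulation are assumptions for illustration, not the paper's pipeline.
```python
# Link bee detections across two frames by combining spatial distance with
# appearance (visual-feature) distance in a single assignment problem.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def match_bees(pos_prev, feat_prev, pos_curr, feat_curr, w_appearance=5.0):
    """pos_*: (n, 2) detection centers; feat_*: (n, d) appearance embeddings."""
    cost = cdist(pos_prev, pos_curr) + w_appearance * cdist(feat_prev, feat_curr)
    rows, cols = linear_sum_assignment(cost)        # one-to-one matching
    return list(zip(rows.tolist(), cols.tolist()))

rng = np.random.default_rng(0)
pos_prev = np.array([[10.0, 10.0], [50.0, 60.0], [80.0, 20.0], [30.0, 90.0]])
feat_prev = rng.normal(size=(4, 8))
pos_curr = pos_prev + rng.normal(0, 1.0, (4, 2))     # small motion between frames
feat_curr = feat_prev + rng.normal(0, 0.05, (4, 8))  # stable appearance
print(match_bees(pos_prev, feat_prev, pos_curr, feat_curr))
# identity matching: [(0, 0), (1, 1), (2, 2), (3, 3)]
```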
8. Markerless tracking of an entire insect colony
- Author
-
Katarzyna Bozek, Laetitia Hebert, Yoann Portugal, and Greg J. Stephens
- Abstract
Data related to the study "Markerless tracking of an entire insect colony". The dataset comprises detection and trajectory information from five beehive recordings (10 fps, 5 min segments). The tracking pipeline is described in our GitHub repository, and sample data was prepared to run the pipeline.
- Published
- 2021
- Full Text
- View/download PDF
9. Markerless tracking of an entire honey bee colony
- Author
-
Katarzyna Bozek, Laetitia Hebert
- Abstract
Data related to the study "Markerless tracking of an entire insect colony". The dataset comprises detection and trajectory information from five beehive recordings (10 fps, 5 min segments). The tracking pipeline is described in our GitHub repository, and sample data was prepared to run the pipeline. We also include the manuscript Supplementary Movies in this dataset. M1: Background extracted from hive L1; each frame represents 12 h of the original recording. M2: Background extracted from hive L5; each frame represents 12 h of the original recording, with predicted brood cells marked in red. M3-M7: Example bee trajectories with their corresponding trajectories through the space of visual features. The originally 64-dimensional vectors of visual features representing each bee instance are projected into 3D using t-Distributed Stochastic Neighbor Embedding (t-SNE). Analogous to plot 3B, the 10 most recent representations of the tracked bee are marked with red dots, and representations of all other bees from the three most recent video frames are marked with yellow dots. Examples are from hives S1-S5, respectively. M8-M12: Short video snippets provided to visually illustrate the proportion of tracked individuals in each hive; all reference trajectories in hives S1-S5, respectively. M13-M15: Bees performing waggle dances, whose trajectories are shown in Fig. 4D. M16-M18: Bees visiting multiple comb cells, whose trajectories are shown in Fig. 4F.
- Published
- 2021
- Full Text
- View/download PDF
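The dataset description in record 9 mentions projecting 64-dimensional per-bee visual features into 3D with t-SNE for the supplementary movies. The snippet below reproduces only the shape of that step with scikit-learn on a random placeholder feature matrix; the perplexity and random seed are arbitrary choices.
```python
# Project 64-dimensional per-instance visual features into 3D with t-SNE,
# so that nearby points correspond to visually similar bee instances.
# The random matrix stands in for the dataset's real feature vectors.
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(1)
features = rng.normal(size=(200, 64))        # placeholder for per-bee visual features
embedding = TSNE(n_components=3, perplexity=30, random_state=0).fit_transform(features)
print(embedding.shape)                       # (200, 3)
```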
10. OrganoidTracker: Efficient cell tracking using machine learning and manual error correction
- Author
-
Rutger N. U. Kok, Laetitia Hebert, Guizela Huelsz-Prince, Yvonne J. Goos, Xuan Zheng, Katarzyna Bozek, Greg J. Stephens, Sander J. Tans, and Jeroen S. van Zon
- Abstract
Time-lapse microscopy is routinely used to follow cells within organoids, allowing direct study of division and differentiation patterns. There is an increasing interest in cell tracking in organoids, which makes it possible to study their growth and homeostasis at the single-cell level. As tracking these cells by hand is prohibitively time consuming, automation using a computer program is required. Unfortunately, organoids have a high cell density and fast cell movement, which makes automated cell tracking difficult. In this work, a semi-automated cell tracker has been developed. To detect the nuclei, we use a machine learning approach based on a convolutional neural network. To form cell trajectories, we link detections at different time points together using a min-cost flow solver. The tracker raises warnings for situations with likely errors. Rapid changes in nucleus volume and position are reported for manual review, as well as cases where nuclei divide, appear and disappear. When the warning system is adjusted such that virtually error-free lineage trees can be obtained, still less than 2% of all detected nuclei positions are marked for manual analysis. This provides an enormous speed boost over manual cell tracking, while still providing tracking data of the same quality as manual tracking. Source: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0240802
- Published
- 2021
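The OrganoidTracker abstract above also describes a warning system that reports rapid changes in nucleus volume and position for manual review. The sketch below illustrates that kind of rule on a toy record type; the thresholds and the data layout are assumptions, not the tool's actual criteria.
```python
# Flag linked nucleus detections whose volume or position changes too
# quickly between frames, so a human can review the likely tracking error.
from dataclasses import dataclass

@dataclass
class Nucleus:
    x: float
    y: float
    z: float
    volume: float

def needs_review(prev: Nucleus, curr: Nucleus, max_shift=10.0, max_volume_ratio=1.5):
    shift = ((curr.x - prev.x) ** 2 + (curr.y - prev.y) ** 2 + (curr.z - prev.z) ** 2) ** 0.5
    ratio = max(curr.volume, prev.volume) / max(min(curr.volume, prev.volume), 1e-9)
    reasons = []
    if shift > max_shift:
        reasons.append(f"moved {shift:.1f} px")
    if ratio > max_volume_ratio:
        reasons.append(f"volume changed {ratio:.2f}x")
    return reasons                       # empty list means no warning

print(needs_review(Nucleus(0, 0, 0, 100.0), Nucleus(2, 1, 0, 240.0)))
# ['volume changed 2.40x']
```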
11. WormPose: Image synthesis and convolutional networks for pose estimation in C. elegans
- Author
-
Laetitia Hebert, Tosif Ahamed, Antonio C. Costa, Liam O'Shaughnessy, and Greg J. Stephens
- Abstract
An important model system for understanding genes, neurons and behavior, the nematode worm C. elegans naturally moves through a variety of complex postures, for which estimation from video data is challenging. We introduce an open-source Python package, WormPose, for 2D pose estimation in C. elegans, including self-occluded, coiled shapes. We leverage advances in machine vision afforded from convolutional neural networks and introduce a synthetic yet realistic generative model for images of worm posture, thus avoiding the need for human-labeled training. WormPose is effective and adaptable for imaging conditions across worm tracking efforts. We quantify pose estimation using synthetic data as well as N2 and mutant worms in on-food conditions. We further demonstrate WormPose by analyzing long (∼8 hour), fast-sampled (∼30 Hz) recordings of on-food N2 worms to provide a posture-scale analysis of roaming/dwelling behaviors. Source: https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1008914
- Published
- 2021
12. WormPose: Image synthesis and convolutional networks for pose estimation in C. elegans
- Author
-
Antonio de Lisboa Lopes Costa, Greg J. Stephens, Tosif Ahamed, Laetitia Hebert, and Liam O'Shaughnessy
- Subjects
Machine vision, Computer science, Pattern recognition, Python (programming language), Convolutional neural network, Nematode worm, Generative model, Leverage (statistics), Artificial intelligence, Pose
- Abstract
An important model system for understanding genes, neurons and behavior, the nematode worm C. elegans naturally moves through a variety of complex postures, for which estimation from video data is challenging. We introduce an open-source Python package, WormPose, for 2D pose estimation in C. elegans, including self-occluded, coiled shapes. We leverage advances in machine vision afforded from convolutional neural networks and introduce a synthetic yet realistic generative model for images of worm posture, thus avoiding the need for human-labeled training. WormPose is effective and adaptable for imaging conditions across worm tracking efforts. We quantify pose estimation using synthetic data as well as N2 and mutant worms in on-food conditions. We further demonstrate WormPose by analyzing long (∼ 10 hour), fast-sampled (∼ 30 Hz) recordings of on-food N2 worms to provide a posture-scale analysis of roaming/dwelling behaviors.
- Published
- 2020
13. Markerless tracking of an entire insect colony
- Author
-
Yoann Portugal, Laetitia Hebert, Greg J. Stephens, and Katarzyna Bozek
- Subjects
Computer science, Orientation (computer vision), Insect, Honey bee, Tracking (particle physics), Convolutional neural network, Brood, Position (vector), Computer vision, Artificial intelligence
- Abstract
We present a comprehensive, computational method for tracking an entire colony of the honey bee Apis mellifera using high-resolution video on a natural honeycomb background. We adapt a convolutional neural network (CNN) segmentation architecture to automatically identify bee and brood cell positions, body orientations and within-cell states. We achieve high accuracy (~10% body width error in position, ~10° error in orientation, and true positive rate > 90%) and demonstrate months-long monitoring of sociometric colony fluctuations. We combine extracted positions with rich visual features of organism-centered images to track individuals over time and through challenging occluding events, recovering ~79% of bee trajectories from five observation hives over a span of 5 minutes. The resulting trajectories reveal important behaviors, including fast motion, comb-cell activity, and waggle dances. Our results provide new opportunities for the quantitative study of collective bee behavior and for advancing tracking techniques of crowded systems.
- Published
- 2020
- Full Text
- View/download PDF
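Record 13 quotes an orientation accuracy of roughly 10°. Computing such a number requires handling the circular wrap-around of angles; the short function below shows one standard way to do this and is purely illustrative, not the paper's evaluation code.
```python
# Mean absolute angular error between predicted and ground-truth body
# orientations, with wrap-around handled so that 359° vs 1° counts as 2°.
import numpy as np

def mean_orientation_error(pred_deg, true_deg):
    pred = np.asarray(pred_deg, dtype=float)
    true = np.asarray(true_deg, dtype=float)
    diff = (pred - true + 180.0) % 360.0 - 180.0   # signed error in (-180, 180]
    return np.abs(diff).mean()

print(mean_orientation_error([359.0, 10.0, 182.0], [1.0, 5.0, 170.0]))  # ≈ 6.33
```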
14. Pixel personality for dense object tracking in a 2D honeybee hive
- Author
-
Katarzyna Bozek, Alexander S. Mikheyev, Laetitia Hebert, and Greg J. Stephens
- Subjects
FOS: Computer and information sciences, Pixel, Computer science, Orientation (computer vision), Computer Vision and Pattern Recognition (cs.CV), Computer Science - Computer Vision and Pattern Recognition, Cognitive neuroscience of visual object recognition, Machine Learning (stat.ML), Tracking (particle physics), Quantitative Biology - Quantitative Methods, Sample (graphics), Object detection, Statistics - Machine Learning, FOS: Biological sciences, Video tracking, Trajectory, Computer vision, Segmentation, Artificial intelligence, Quantitative Methods (q-bio.QM)
- Abstract
Tracking large numbers of densely-arranged, interacting objects is challenging due to occlusions and the resulting complexity of possible trajectory combinations, as well as the sparsity of relevant, labeled datasets. Here we describe a novel technique of collective tracking in the model environment of a 2D honeybee hive in which sample colonies consist of $N\sim10^3$ highly similar individuals, tightly packed, and in rapid, irregular motion. Such a system offers universal challenges for multi-object tracking, while being conveniently accessible for image recording. We first apply an accurate, segmentation-based object detection method to build initial short trajectory segments by matching object configurations based on class, position and orientation. We then join these tracks into full single object trajectories by creating an object recognition model which is adaptively trained to recognize honeybee individuals through their visual appearance across multiple frames, an attribute we denote as pixel personality. Overall, we reconstruct ~46% of the trajectories in 5 min recordings from two different hives and over 71% of the tracks for at least 2 min. We provide validated trajectories spanning 3000 video frames of 876 unmarked moving bees in two distinct colonies in different locations and filmed with different pixel resolutions, which we expect to be useful in the further development of general-purpose tracking solutions. Comment: 13 pages, 4 main and 9 supplementary figures, as well as a link to supplementary movies.
- Published
- 2019
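Record 14 joins short trajectory segments into full tracks by recognizing individual bees from their visual appearance ("pixel personality"). The sketch below shows the stitching step in a heavily simplified, assumption-laden form: precomputed appearance embeddings, a cosine-similarity threshold, and a greedy forward-in-time search stand in for the paper's adaptively trained recognition model.
```python
# Stitch short tracklets together when the appearance embedding at the end
# of one segment closely matches the embedding at the start of a later one.
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def join_tracklets(tracklets, threshold=0.9):
    """tracklets: dicts with 'start_frame', 'end_frame', 'start_feat', 'end_feat'."""
    links = []
    for i, a in enumerate(tracklets):
        best, best_sim = None, threshold
        for j, b in enumerate(tracklets):
            if i == j or b["start_frame"] <= a["end_frame"]:
                continue                       # only link forward in time
            sim = cosine(a["end_feat"], b["start_feat"])
            if sim > best_sim:
                best, best_sim = j, sim
        if best is not None:
            links.append((i, best))
    return links

rng = np.random.default_rng(2)
f = rng.normal(size=8)
tracklets = [
    {"start_frame": 0, "end_frame": 10, "start_feat": f, "end_feat": f},
    {"start_frame": 12, "end_frame": 30, "start_feat": f + 0.01, "end_feat": f},
    {"start_frame": 12, "end_frame": 30, "start_feat": rng.normal(size=8), "end_feat": f},
]
print(join_tracklets(tracklets))   # [(0, 1)]: segment 0 continues as segment 1
```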
15. Author Correction: Markerless tracking of an entire honey bee colony
- Author
-
Alexander S. Mikheyev, Greg J. Stephens, Yoann Portugal, Katarzyna Bozek, and Laetitia Hebert
- Subjects
Multidisciplinary, Promotion (chess), Science, General Physics and Astronomy, Library science, General Chemistry, Tracking (education), Method development, Imaging data, General Biochemistry, Genetics and Molecular Biology
- Abstract
The original version of this Article omitted from the author list the fourth author Alexander S. Mikheyev, who is from the Ecology and Evolution Unit, OIST Graduate University, Okinawa, Japan, and the Research School of Biology, Australian National University, Canberra, Australia. The third author Yoann Portugal has the following additional affiliation: Ecology and Evolution Unit, OIST Graduate University, Okinawa, Japan. The fourth author Alexander S. Mikheyev and the fifth author Greg J. Stephens declare equal contributions. Consequently, the Acknowledgements, which formerly read “We thank Michael Iuzzolino, Dieu My thanh Nguyen, Orit Peleg, and Michael Smith for comments on the manuscript and code testing. This work was supported by the Okinawa Institute of Science and Technology Graduate University”, have been corrected to “We are grateful to Takahashi Ikemiya for maintaining the experimental bee colonies. We thank Michael Iuzzolino, Dieu My Thanh Nguyen, Orit Peleg, and Michael Smith for comments on the manuscript and code testing. This work was supported by the Okinawa Institute of Science and Technology Graduate University. Additional funding was provided by KAKENHI grants 16H06209 and 16KK0175 from the Japan Society for the Promotion of Science to AM”. Additionally, the Author Contributions, which formerly read “Y.P. performed the bee work and devised the imaging setup, L. H. devised the labeling tool, K.B. performed method development and data analysis, K.B. and G.S. designed the study and wrote the manuscript”, has been corrected to “Y.P. performed the bee work, Y.P. and A.M. devised the imaging setup, L.H. devised the labeling tool, K.B. performed method development and data analysis, K.B., A.M., and G.S. designed the study, K.B. and G.S. wrote the manuscript”. This has been corrected in both the PDF and HTML versions of the Article. The original version of the Supplementary information associated with this Article contained an error in the description of Supplementary Table 2, which incorrectly read “All imaging data in this study were collected in 2019”. The correct version states “2018” in place of “2019”. The HTML has been updated to include a corrected version of the Supplementary information.
- Published
- 2021
16. OrganoidTracker: Efficient cell tracking using machine learning and manual error correction
- Author
-
Laetitia Hebert, Guizela Huelsz-Prince, Katarzyna Bozek, Sander J. Tans, Yvonne J. Goos, Xuan Zheng, Jeroen S. van Zon, Greg J. Stephens, Rutger N.U. Kok, Physics of Living Systems, and LaserLaB - Molecular Biophysics
- Subjects
Computer science, Tracking (particle physics), Convolutional neural network, Machine Learning, Automation, Cognition, Learning and Memory, Software, Microscopy, Medicine and Health Sciences, Cell Cycle and Cell Division, Organ Cultures, Multidisciplinary, Artificial neural network, Software Engineering, Cell Differentiation, Division (mathematics), Organoids, Cell Tracking, Cell Processes, Memory Recall, Medicine, Engineering and Technology, Biological Cultures, Anatomy, Algorithms, Research Article, Computer and Information Sciences, Neural Networks, Imaging Techniques, Science, Research and Analysis Methods, Machine learning, Computer Software, Memory, Fluorescence Imaging, Organoid, Humans, Volume (computing), Biology and Life Sciences, Cell Biology, Gastrointestinal Tract, Cognitive Science, Neural Networks, Computer, Artificial intelligence, Error detection and correction, Digestive System, Neuroscience, Developmental Biology
- Abstract
Time-lapse microscopy is routinely used to follow cells within organoids, allowing direct study of division and differentiation patterns. There is an increasing interest in cell tracking in organoids, which makes it possible to study their growth and homeostasis at the single-cell level. As tracking these cells by hand is prohibitively time consuming, automation using a computer program is required. Unfortunately, organoids have a high cell density and fast cell movement, which makes automated cell tracking difficult. In this work, a semi-automated cell tracker has been developed. To detect the nuclei, we use a machine learning approach based on a convolutional neural network. To form cell trajectories, we link detections at different time points together using a min-cost flow solver. The tracker raises warnings for situations with likely errors. Rapid changes in nucleus volume and position are reported for manual review, as well as cases where nuclei divide, appear and disappear. When the warning system is adjusted such that virtually error-free lineage trees can be obtained, still less than 2% of all detected nuclei positions are marked for manual analysis. This provides an enormous speed boost over manual cell tracking, while still providing tracking data of the same quality as manual tracking.
- Published
- 2020
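The OrganoidTracker records describe producing virtually error-free lineage trees once detections are linked and divisions are verified. As a final, speculative illustration of the bookkeeping that implies, the sketch below models each cell as a node whose two daughters become its children; the naming scheme and data model are assumptions, not the program's internal representation.
```python
# Minimal lineage-tree bookkeeping: a division turns one tracked cell into
# two daughter nodes, and the leaves of the tree are the cells still alive
# at the end of the recording.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Cell:
    name: str
    first_frame: int
    children: List["Cell"] = field(default_factory=list)

    def divide(self, frame):
        d1 = Cell(self.name + ".1", frame)
        d2 = Cell(self.name + ".2", frame)
        self.children = [d1, d2]
        return d1, d2

def leaves(cell):
    if not cell.children:
        return [cell]
    return [leaf for child in cell.children for leaf in leaves(child)]

root = Cell("C0", first_frame=0)
a, b = root.divide(frame=40)
a.divide(frame=95)
print([leaf.name for leaf in leaves(root)])   # ['C0.1.1', 'C0.1.2', 'C0.2']
```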