1. An Efficient and Low-Cost Deep Learning-Based Method for Counting and Sizing Soybean Nodules.
- Authors
Wang, Xueying; Yu, Nianping; Sun, Yongzhe; Guo, Yixin; Pan, Jinchao; Niu, Jiarui; Liu, Li; Chen, Hongyu; Cao, Junzhuo; Cao, Haifeng; Chen, Qingshan; Xin, Dawei; Zhu, Rongsheng
- Subjects
NITROGEN fixation; ROOT-tubercles; IMAGE segmentation; DEEP learning; PLANT growth
- Abstract
Soybeans are an essential source of food, protein, and oil worldwide, and the nodules on their root systems play a critical role in nitrogen fixation and plant growth. In this study, we tackled the challenge of limited high-resolution image quantities and the constraints on model learning by employing image segmentation technology for an in-depth analysis of soybean nodule phenomics. Through a carefully designed segmentation algorithm, we broke large-resolution images down into numerous smaller ones, improving the model's learning efficiency and significantly increasing the available data volume, thus laying a solid foundation for subsequent analysis. In model selection and optimization, after several rounds of comparison and testing, YOLOX was identified as the optimal model, achieving an accuracy of 91.38% on the test set with an R² of up to 86%, demonstrating its efficiency and reliability in nodule counting tasks. Subsequently, we used YOLOv5 for instance segmentation, achieving a precision of 93.8% in quickly and accurately extracting key phenotypic indicators such as the area, circumference, length, and width of the nodules, and calculated the statistical properties of these indicators. This provided a wealth of quantitative data for the morphological study of soybean nodules. The research not only enhanced the efficiency and accuracy of obtaining nodule phenotypic data and reduced costs but also provided important scientific evidence for the selection and breeding of soybean materials, highlighting its potential application value in agricultural research and practical production. [ABSTRACT FROM AUTHOR]
- Published
- 2024
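The abstract's tiling step (breaking large-resolution images into numerous smaller ones to increase the effective data volume) can be sketched as below. This is a minimal illustration, not the paper's actual algorithm: the tile size, the non-overlapping grid, and the choice to keep smaller edge patches are all assumptions.

```python
import numpy as np


def tile_image(image: np.ndarray, tile: int = 640) -> list[np.ndarray]:
    """Split an H x W x C image array into non-overlapping tile x tile patches.

    Edge patches smaller than `tile` are kept, so no pixels are discarded.
    (Tile size 640 is an assumed value, not taken from the paper.)
    """
    h, w = image.shape[:2]
    patches = []
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            patches.append(image[y:y + tile, x:x + tile])
    return patches


# A 1300 x 1300 image yields a 3 x 3 grid: four full 640 x 640 tiles
# plus smaller patches along the right and bottom edges.
img = np.zeros((1300, 1300, 3), dtype=np.uint8)
patches = tile_image(img, tile=640)
print(len(patches))  # 9
```

For detection training, each source image's bounding-box annotations would also need to be clipped and re-expressed in the coordinate frame of the patch they fall into; that bookkeeping is omitted here.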