A Camera-LiDAR-IMU fusion method for real-time extraction of navigation line between maize field rows.

Authors :
Ban, Chao
Wang, Lin
Chi, Ruijuan
Su, Tong
Ma, Yueqi
Source :
Computers & Electronics in Agriculture. Aug 2024, Vol. 223.
Publication Year :
2024

Abstract

• The proposed multi-sensor fusion method can extract navigation lines from complex seedling maize fields in real time.
• The Itti algorithm was improved for extracting green plants from images with different exposures.
• A non-maize point cloud filtering method that fuses attitude angles and the image was proposed.
• A green feature map overlaid with a maize point cloud was used to accurately extract the navigation line.

The navigation line is crucial for the autonomous navigation of spraying and weeding robots in unstructured and complex agricultural fields. Characteristics of this environment, such as unstable light, uneven terrain, and chaotic weeds, expose the shortcomings of existing navigation line extraction methods, namely low accuracy or insufficient real-time responsiveness. Therefore, this study presents a method (named CLI-Fusion) based on the fusion of a camera, light detection and ranging (LiDAR), and an inertial measurement unit (IMU) to extract the navigation line between maize crop rows during the seedling stage accurately and in real time. CLI-Fusion consists of three steps built on multi-sensor spatial synchronization. First, the Itti algorithm is improved to detect the green salient region in the image. The green salient region is then combined with the Otsu method and the excess green index (ExG) to generate a green feature map, which separates the green plants from the ground in the image. Second, weeds and ground in the point cloud are filtered out by fusing the IMU's attitude angles with the green feature map. The retained maize point cloud is sectorized, and centerlines of the crop rows are coarsely detected. Third, the sectorized maize point cloud is projected onto the green feature map, and the edge pixels of the green feature map in the projected area are detected. The least squares method is then used to fit more accurate centerlines of the crop rows, from which the navigation line is extracted.
The experimental results show that the average accuracy rate, angle error, and processing time are 90.0%, 1.84°, and 55.4 ms, respectively, suggesting that CLI-Fusion achieves higher accuracy than single-sensor methods while maintaining real-time performance. The multi-sensor fusion method for navigation line extraction provides technical support for the navigation of agricultural robots in farmlands. [ABSTRACT FROM AUTHOR]
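The segmentation and line-fitting steps the abstract names (ExG greenness, Otsu thresholding, least-squares centerline fitting) are standard techniques. The following is a minimal NumPy sketch of those generic techniques, not the authors' implementation; all function names are illustrative, and the paper's Itti-based saliency and LiDAR/IMU fusion stages are omitted.

```python
import numpy as np

def excess_green(rgb):
    """Excess green index ExG = 2G - R - B on a float RGB image (H, W, 3)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 2.0 * g - r - b

def otsu_threshold(values, bins=256):
    """Otsu's method: choose the threshold maximizing between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    hist = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(hist)            # class-0 (background) probability
    mu = np.cumsum(hist * centers)  # cumulative mean
    mu_t = mu[-1]
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    between = np.zeros_like(w0)
    between[valid] = (mu_t * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(between)]

def fit_centerline(mask):
    """Least-squares line x = a*y + b through the foreground pixels of a mask."""
    ys, xs = np.nonzero(mask)
    a, b = np.polyfit(ys, xs, 1)
    return a, b

# Toy example: a vertical green "crop row" stripe in a dark image.
img = np.zeros((40, 40, 3))
img[:, 18:23, 1] = 1.0                  # green channel high in columns 18-22
exg = excess_green(img)
thr = otsu_threshold(exg.ravel())
mask = exg > thr                        # green feature map (plants vs. ground)
a, b = fit_centerline(mask)             # near-vertical centerline at x ~ 20
```

In a real field image the per-row centerlines would be fitted separately, and the navigation line derived from the gap between adjacent row centerlines.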

Details

Language :
English
ISSN :
0168-1699
Volume :
223
Database :
Academic Search Index
Journal :
Computers & Electronics in Agriculture
Publication Type :
Academic Journal
Accession number :
177856755
Full Text :
https://doi.org/10.1016/j.compag.2024.109114