Grape Cluster Detection Based on Deep Learning (基于深度学习的葡萄果穗检测).
- Source :
- Science Technology & Engineering. 2023, Vol. 23 Issue 8, p3216-3223. 8p.
- Publication Year :
- 2023
Abstract
- Fruit detection is a key technology in automated agricultural picking. To address the perishability of grapes at maturity, varying degrees of ripeness, the complex backgrounds of grape orchards, and changeable lighting conditions, a lightweight improved detection and recognition method based on the YOLO v5s algorithm was proposed. First, the EfficientNet-v2 network was used as the feature-extraction backbone and integrated with a local cross-channel interaction mechanism without dimensionality reduction, which greatly reduced the model size and parameter count while preserving accuracy, and accelerated model inference. Second, to further compensate for the accuracy loss caused by model simplification, a coordinate attention mechanism was introduced at key positions in the model's feature fusion to strengthen attention to the target, improving the model's ability to detect dense targets and resist complex background interference, and ensuring the overall performance and reliability of the algorithm. The experimental results show that the improved algorithm achieves an average accuracy of 98.7%, an average detection speed of 0.028 s, and a model size of only 12.01 MB. Compared with the original algorithm, accuracy is increased by 0.41%, detection speed by 22%, and model size is reduced by 13.2%. In orchard-scene image detection tests, the proposed algorithm detects grape clusters well, identifies their state, and adapts strongly to varied environmental conditions, providing a reference for the development of automatic picking technology. [ABSTRACT FROM AUTHOR]
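The "local cross-channel interaction without dimensionality reduction" mechanism the abstract describes is the idea behind efficient channel attention (ECA): pool each channel to a scalar, let nearby channels interact through a small shared 1-D kernel (no bottleneck layer), and gate the feature map with the result. The sketch below illustrates that data flow in plain NumPy; the function name, the fixed averaging kernel, and the kernel size are illustrative assumptions, not the authors' implementation (in practice the kernel is learned and the module sits inside the network).

```python
import numpy as np

def eca_attention(x, k=3):
    """Sketch of ECA-style channel reweighting on a feature map x of
    shape (C, H, W). k is the 1-D kernel size controlling how many
    neighbouring channels interact. Illustrative only."""
    c = x.shape[0]
    # Global average pooling: one descriptor per channel, shape (C,).
    y = x.mean(axis=(1, 2))
    # Local cross-channel interaction: 1-D convolution along the channel
    # axis with a shared kernel -- note there is no reduction/expansion
    # bottleneck, which is what keeps the parameter count tiny.
    pad = k // 2
    y_pad = np.pad(y, pad, mode="edge")
    w = np.ones(k) / k  # fixed averaging kernel here; learned in practice
    z = np.array([np.dot(y_pad[i:i + k], w) for i in range(c)])
    # Sigmoid gate, then rescale each channel of the input feature map.
    gate = 1.0 / (1.0 + np.exp(-z))
    return x * gate[:, None, None]
```

Because the interaction is a single shared 1-D kernel over C channel descriptors, the module adds only k parameters per layer, consistent with the abstract's emphasis on shrinking model size without hurting accuracy.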
Details
- Language :
- Chinese
- ISSN :
- 1671-1815
- Volume :
- 23
- Issue :
- 8
- Database :
- Academic Search Index
- Journal :
- Science Technology & Engineering
- Publication Type :
- Academic Journal
- Accession number :
- 163243071