
未成熟芒果的改进 YOLOv2 识别方法 [An improved YOLOv2 recognition method for immature mangoes].

Authors :
薛月菊
黄宁
涂淑琴
毛亮
杨阿
朱勋沐
杨晓帆
陈鹏飞
Source :
Transactions of the Chinese Society of Agricultural Engineering. 2018, Vol. 34 Issue 7, p173-179. 7p.
Publication Year :
2018

Abstract

Automatic detection of immature mangoes on trees is a key step toward early yield estimation and intelligent spraying. In orchard scenes, detecting immature mangoes is extremely difficult because of variable lighting, complex backgrounds, and the high color similarity between mangoes and leaves; detecting occluded and overlapping mangoes is especially challenging. An immature mango detection model based on an improved YOLOv2, with both high speed and high accuracy, was proposed. By introducing a dense connectivity pattern into Tiny-yolo, a Tiny-yolo network with dense blocks (Tiny-yolo-dense) was designed. The dense block replaced the 7th convolutional layer of Tiny-yolo, which operates on a low-resolution feature map, to avoid adding excessive computational complexity. In the dense block, each layer receives the feature maps of all preceding layers and passes its own feature map to all subsequent layers, which improves information flow and encourages feature reuse and multi-level feature fusion. To address the difficulty of detecting occluded or overlapping mangoes, foreground regions of occluded or overlapping fruits were manually labeled, and these foreground-region samples were used to train the YOLOv2 model. Extracting features only from the foreground region avoids learning redundant features from non-target areas inside the bounding box, which strengthens feature learning for occluded or overlapping fruits.

The steps of this work were as follows. First, to reduce the influence of natural lighting variation, adaptive histogram equalization was used to improve the quality of the training images under varied illumination, and the foreground regions and bounding boxes of the training samples were manually labeled. The original training set comprised 3,702 bounding-box labels covering all mango targets and 1,077 foreground-region labels of occluded or overlapping fruits from 660 images; a testing set for model validation comprised 1,713 bounding-box labels of mango targets from another 300 images. The training data were then augmented by horizontal flipping and multi-angle rotation, reflecting the variety of hanging postures of mangoes; rotation angles of ±10° and ±20° were used. Next, Tiny-yolo-dense was used as the base network of the improved YOLOv2 mango detector. With higher-resolution inputs and a multi-scale training strategy, the foreground-region and bounding-box labels were used to train the improved YOLOv2 network: every 10 batches, the current training scale was randomly switched to another before training continued. In this paper, an input scale of 512×512 and training scales of {384, 416, ..., 672} were selected. Finally, the trained networks were used to detect mangoes on the testing set.

The experimental results showed that the proposed algorithm performed well for mango detection in natural scenes. At a detection rate of 83 fps, precision reached 97.02% and recall reached 95.10%. The algorithm was 11 times faster than Faster R-CNN (region-based convolutional neural network) and achieved better detection performance in complex scenes with occluded and overlapping mangoes. It also significantly outperformed YOLOv2 and an AdaBoost classifier based on HOG (histogram of oriented gradients) features. Our work provides an effective method for quickly detecting mangoes in orchard scenes. [ABSTRACT FROM AUTHOR]
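The abstract's first step is adaptive histogram equalization to compensate for lighting variation. Below is a minimal sketch of that preprocessing using OpenCV's CLAHE (contrast-limited adaptive histogram equalization) applied to the lightness channel; the clip limit and tile grid size are illustrative assumptions, not values reported in the paper.

```python
import cv2

def equalize_lighting(bgr_image):
    """Apply CLAHE to the L channel of a BGR image to even out illumination.

    clipLimit and tileGridSize are illustrative defaults,
    not values reported in the paper.
    """
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    l_eq = clahe.apply(l)
    return cv2.cvtColor(cv2.merge((l_eq, a, b)), cv2.COLOR_LAB2BGR)
```

Equalizing only the lightness channel keeps the hue untouched, which matters here since color similarity between mangoes and leaves is already a problem.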
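For the described augmentation (horizontal flipping plus rotations of ±10° and ±20°), a sketch follows. Transforming the bounding-box labels alongside the image is my assumption about the pipeline: box corners are rotated and re-enclosed in axis-aligned rectangles (clipping to the image bounds is omitted for brevity).

```python
import cv2
import numpy as np

ANGLES = (-20, -10, 10, 20)  # rotation angles stated in the abstract

def augment(image, boxes):
    """Yield flipped and rotated copies of an image with adjusted boxes.

    boxes: float array of [x1, y1, x2, y2] rows. The box handling is an
    assumption; corners are rotated, then re-enclosed axis-aligned.
    """
    h, w = image.shape[:2]
    # Horizontal flip: x coordinates are mirrored, y unchanged.
    yield cv2.flip(image, 1), np.stack(
        [w - boxes[:, 2], boxes[:, 1], w - boxes[:, 0], boxes[:, 3]], axis=1)
    for angle in ANGLES:
        M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)  # 2x3 affine
        rotated = cv2.warpAffine(image, M, (w, h))
        # Four corners per box, stacked as (4N, 2): TL, TR, BL, BR blocks.
        corners = np.concatenate([
            boxes[:, [0, 1]], boxes[:, [2, 1]],
            boxes[:, [0, 3]], boxes[:, [2, 3]]], axis=0)
        ones = np.ones((corners.shape[0], 1))
        pts = (np.hstack([corners, ones]) @ M.T).reshape(4, -1, 2)
        new_boxes = np.stack([
            pts[..., 0].min(0), pts[..., 1].min(0),
            pts[..., 0].max(0), pts[..., 1].max(0)], axis=1)
        yield rotated, new_boxes
```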
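The core architectural change replaces Tiny-yolo's 7th convolutional layer, which works on a low-resolution feature map, with a dense block. The paper's network is built on the Darknet framework; the PyTorch sketch below only illustrates the dense-connectivity pattern the abstract describes, and the layer count and growth rate are my assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Dense connectivity: every layer sees the concatenation of all
    earlier feature maps and contributes its own (DenseNet-style).

    num_layers and growth_rate are illustrative choices, not the
    paper's reported configuration.
    """
    def __init__(self, in_channels, growth_rate=32, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.LeakyReLU(0.1, inplace=True),
                nn.Conv2d(channels, growth_rate, kernel_size=3, padding=1),
            ))
            channels += growth_rate

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            # Each layer receives the feature maps of all preceding layers.
            out = layer(torch.cat(features, dim=1))
            features.append(out)
        return torch.cat(features, dim=1)
```

Concatenation along the channel dimension is what lets each layer see all earlier feature maps, the property the abstract credits for feature reuse and multi-level feature fusion.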
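The multi-scale training schedule, redrawing the input resolution from {384, 416, ..., 672} every 10 batches, can be stated precisely. A sketch, assuming (as in standard YOLOv2 practice) that the scales step in multiples of 32:

```python
import random

# YOLOv2-style multi-scale training: input sizes in multiples of 32,
# here {384, 416, ..., 672} as stated in the abstract.
SCALES = list(range(384, 673, 32))

def training_scales(num_batches, switch_every=10, seed=0):
    """Yield the input resolution for each batch; the scale is re-drawn
    at random every `switch_every` batches, per the abstract."""
    rng = random.Random(seed)
    scale = rng.choice(SCALES)
    for batch in range(num_batches):
        if batch > 0 and batch % switch_every == 0:
            scale = rng.choice(SCALES)
        yield scale
```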
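Finally, the reported precision (97.02%) and recall (95.10%) follow the standard detection definitions: precision = TP/(TP+FP), recall = TP/(TP+FN). A sketch of such an evaluation under greedy IoU matching; the 0.5 threshold and one-to-one matching rule are conventional assumptions, since the abstract does not give the paper's matching criterion.

```python
def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def precision_recall(detections, truths, thresh=0.5):
    """Count TP/FP/FN with greedy one-to-one matching.

    detections should be pre-sorted by descending confidence; the 0.5
    IoU threshold is the usual convention, not a value from the paper.
    """
    matched, tp = set(), 0
    for det in detections:
        best, best_iou = None, thresh
        for i, gt in enumerate(truths):
            overlap = iou(det, gt)
            if i not in matched and overlap >= best_iou:
                best, best_iou = i, overlap
        if best is not None:
            matched.add(best)
            tp += 1
    fp = len(detections) - tp   # detections with no matching mango
    fn = len(truths) - tp       # mangoes the detector missed
    return tp / max(tp + fp, 1), tp / max(tp + fn, 1)
```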

Details

Language :
Chinese
ISSN :
1002-6819
Volume :
34
Issue :
7
Database :
Academic Search Index
Journal :
Transactions of the Chinese Society of Agricultural Engineering
Publication Type :
Academic Journal
Accession number :
129041361
Full Text :
https://doi.org/10.11975/j.issn.1002-6819.2018.07.022