
An image restoration and detection method for picking robot based on convolutional auto-encoder.

Authors :
Chen, Jiqing
Zhang, Hongdu
Wang, Zhikui
Wu, Jiahua
Luo, Tian
Wang, Huabin
Long, Teng
Source :
Computers & Electronics in Agriculture. May 2022, Vol. 196.
Publication Year :
2022

Abstract

• An image restoration method based on a convolutional auto-encoder is proposed.
• The region growing method is used to segment and encode the obstruction according to its contour.
• The overall contour of the citrus is predicted from its unobstructed contour.
• The area to be repaired is filled using a similar-pixel selection method.
• The repair rate and image recognition rate are effectively improved.

At present, machine vision and deep learning have been widely applied to fruit recognition and picking. However, during identification and picking, the target is often occluded, so existing methods either cannot identify it accurately or identify it with a low accuracy rate. To improve the recognition rate of fruits under occlusion, this paper proposes an image restoration method based on a convolutional auto-encoder. The method first encodes the obstruction, then predicts the overall contour of the occluded fruit from its unobstructed contour, and combines the predicted contour with the encoded region to determine the area to be repaired. Finally, the area to be repaired is filled with pixels, realizing restoration and recognition of the occluded image. The proposed method achieves an average repair rate of 95.96%; its restoration rate is 3.52% higher and its L2 loss 0.63% lower than those of the traditional convolution method, and the average detection accuracy on the restored fruits is 94.77%. [ABSTRACT FROM AUTHOR]
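The "region growing" segmentation step mentioned in the highlights can be illustrated with a minimal sketch. This is not the paper's implementation: the function name, the intensity tolerance `tol`, and the 4-connectivity choice are assumptions made here for illustration.

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol=10):
    """Grow a region from `seed`: collect 4-connected pixels whose
    intensity lies within `tol` of the seed pixel's intensity."""
    h, w = img.shape
    seed_val = int(img[seed])
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(int(img[ny, nx]) - seed_val) <= tol):
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask

# Toy grayscale image: a bright 3x3 "occluder" patch on a dark background.
img = np.zeros((5, 5), dtype=np.uint8)
img[1:4, 1:4] = 200
mask = region_grow(img, seed=(2, 2), tol=10)
```

In the pipeline the abstract describes, a mask like this delimits the obstruction's contour, which is then encoded and combined with the predicted fruit contour to mark the area to be repaired.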

Details

Language :
English
ISSN :
0168-1699
Volume :
196
Database :
Academic Search Index
Journal :
Computers & Electronics in Agriculture
Publication Type :
Academic Journal
Accession number :
156253020
Full Text :
https://doi.org/10.1016/j.compag.2022.106896