1. Computer vision‐based tree trunk and branch identification and shaking points detection in Dense‐Foliage canopy for automated harvesting of apples.
- Authors
- Zhang, Xin, Karkee, Manoj, Zhang, Qin, and Whiting, Matthew D.
- Subjects
- TREE trunks, TREE branches, CONVOLUTIONAL neural networks, COMPUTER vision, APPLES, SEASONAL employment
- Abstract
Fresh market apples are one of the high‐value crops in the United States; Washington alone has produced two‐thirds of the annual national production over the past 10 years. However, the availability of seasonal labor is increasingly uncertain. Shake‐and‐catch automated harvesting solutions have, therefore, become attractive for addressing this challenge. One of the significant challenges in applying this harvesting system is effectively positioning the end‐effector at appropriate excitation locations. A computer vision system was used to identify such locations automatically. Convolutional neural networks (CNNs) were utilized to identify tree trunks and branches to support the automated determination of excitation locations. Three CNN architectures were employed: Deeplab v3+ with ResNet‐18, VGG‐16, and VGG‐19 backbones. Four pixel classes were predefined (branches, trunks, apples, and leaves) to segment canopies trained to simple, narrow, accessible, and productive (SNAP) tree architectures with varying foliage density. Results on the Fuji cultivar showed that ResNet‐18 outperformed the VGGs in identifying branches and trunks on all three evaluation measures: per‐class accuracy (PcA), intersection over union (IoU), and boundary‐F1 score (BFScore). ResNet‐18 achieved a PcA of 97%, an IoU of 0.69, and a BFScore of 0.89. The ResNet‐18 model was further evaluated for robustness on additional test canopy images. When the method was applied to Scifresh, one of the densest‐foliage cultivars, it achieved IoUs of 0.41 and 0.62 and BFScores of 0.71 and 0.86 for branches and trunks, respectively. This identification enabled 72% of the automatically determined shaking points to be rated in the "good" category by human experts. [ABSTRACT FROM AUTHOR]
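The abstract reports segmentation quality using per‐class accuracy and intersection over union. As a minimal, hypothetical sketch (not the authors' implementation), these per‐class metrics can be computed from flat lists of predicted and ground‐truth pixel labels; the class names and helper function below are illustrative assumptions.

```python
def per_class_metrics(pred, truth, classes):
    """Return {class: (per-class accuracy, IoU)} for flat label sequences.

    Per-class accuracy here is the fraction of a class's true pixels that
    were labeled correctly; IoU is true positives over the union of
    predicted and true pixels for that class. (Illustrative sketch only.)
    """
    metrics = {}
    for c in classes:
        tp = sum(1 for p, t in zip(pred, truth) if p == c and t == c)
        fp = sum(1 for p, t in zip(pred, truth) if p == c and t != c)
        fn = sum(1 for p, t in zip(pred, truth) if p != c and t == c)
        acc = tp / (tp + fn) if (tp + fn) else 0.0
        iou = tp / (tp + fp + fn) if (tp + fp + fn) else 0.0
        metrics[c] = (acc, iou)
    return metrics

# Toy example using the four pixel classes named in the abstract.
pred  = ["branch", "trunk", "leaf",   "branch", "apple", "leaf"]
truth = ["branch", "trunk", "branch", "branch", "apple", "leaf"]
print(per_class_metrics(pred, truth, ["branch", "trunk", "apple", "leaf"]))
```

In the toy example above, one true branch pixel is mislabeled as leaf, which lowers both the branch accuracy and the leaf IoU, mirroring how dense foliage can depress branch IoU in the reported results.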
- Published
- 2021