
An Automated Training of Deep Learning Networks by 3D Virtual Models for Object Recognition.

Authors :
Židek, Kamil
Lazorík, Peter
Piteľ, Ján
Hošovský, Alexander
Source :
Symmetry (2073-8994); Apr 2019, Vol. 11, Issue 4, p496, 1p
Publication Year :
2019

Abstract

Small-series production with a high level of variability is not suitable for full automation, so a manual assembly process must be used; it can be improved by cooperative robots and assisted by augmented reality devices. The assisted assembly process needs a reliable object recognition implementation. Currently used marker-based technologies do not work reliably with objects that lack a distinctive texture, for example screws, nuts, and washers (single-colored parts). The methodology presented in the paper introduces a new approach to object detection using deep learning networks trained remotely on 3D virtual models. A remote web application generates the training input datasets from the virtual 3D models. This new approach was evaluated with two different neural network models (Faster R-CNN Inception v2 and SSD MobileNet V2). The main advantage of this approach is the very fast preparation of the 2D training dataset from the virtual 3D models; the whole process can run in the cloud. The experiments were conducted with standard parts (nuts, screws, washers), and the recognition precision achieved was comparable to that obtained by training with real samples. The learned models were tested on two different embedded devices running the Android operating system: Virtual Reality (VR) Cardboard glasses (Samsung S7) and Augmented Reality (AR) smart glasses (Epson Moverio M350). The recognition processing delays of the learned models running on the ARM-based embedded devices and on a standard x86 processing unit were also measured for performance comparison. [ABSTRACT FROM AUTHOR]
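
The abstract's central step, generating 2D training images from 3D virtual models, is not detailed in this record. As a minimal illustrative sketch of that idea, the following Python example renders a part from randomized viewpoints using the open-source trimesh and pyrender libraries; the file name screw.stl, the sample count, and all camera and lighting parameters are assumptions, and the bounding-box annotation a detector training pipeline would also need is omitted.

```python
import numpy as np
import trimesh
import pyrender
import imageio

# Hypothetical input: a 3D virtual model of a standard part (e.g., a screw).
mesh = trimesh.load('screw.stl', force='mesh')

renderer = pyrender.OffscreenRenderer(viewport_width=640, viewport_height=480)

for i in range(100):  # samples per part; the real dataset size is an assumption
    scene = pyrender.Scene(bg_color=[1.0, 1.0, 1.0], ambient_light=[0.3, 0.3, 0.3])

    # Show the part in a random orientation so the detector sees many poses.
    axis = np.random.randn(3)
    axis /= np.linalg.norm(axis)
    angle = np.random.uniform(0.0, 2.0 * np.pi)
    pose = trimesh.transformations.rotation_matrix(angle, axis)
    scene.add(pyrender.Mesh.from_trimesh(mesh), pose=pose)

    # Camera and light on the +Z axis, looking back at the origin
    # (pyrender cameras look along -Z); distance scaled to the model size.
    cam_pose = np.eye(4)
    cam_pose[2, 3] = 3.0 * mesh.scale
    scene.add(pyrender.PerspectiveCamera(yfov=np.pi / 3.0), pose=cam_pose)
    scene.add(pyrender.DirectionalLight(intensity=3.0), pose=cam_pose)

    color, _depth = renderer.render(scene)
    imageio.imwrite(f'dataset/screw_{i:04d}.png', color)

renderer.delete()
```

Randomizing pose (and, in a fuller pipeline, lighting, background, and texture) is what lets a detector trained purely on renders generalize to real photographs of the same parts.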
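The recognition processing delays mentioned at the end of the abstract could be measured in several ways; the procedure is not given here. Below is a minimal sketch, assuming (an assumption, not stated in the record) a TensorFlow Lite export of the SSD MobileNet V2 detector, that times repeated inference with the TFLite Python interpreter; the model file name and run count are hypothetical placeholders.

```python
import time
import numpy as np
import tensorflow as tf

# Hypothetical model file; the paper's actual export format is not stated here.
interpreter = tf.lite.Interpreter(model_path='ssd_mobilenet_v2.tflite')
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()

# Dummy input matching the model's declared shape and dtype; real images
# would be resized and preprocessed the same way before timing.
dummy = np.zeros(input_details[0]['shape'], dtype=input_details[0]['dtype'])

# Warm-up run so one-time initialization cost is excluded from the timing.
interpreter.set_tensor(input_details[0]['index'], dummy)
interpreter.invoke()

latencies_ms = []
for _ in range(50):  # run count is arbitrary for illustration
    interpreter.set_tensor(input_details[0]['index'], dummy)
    start = time.perf_counter()
    interpreter.invoke()
    latencies_ms.append((time.perf_counter() - start) * 1000.0)

print(f'mean inference latency: {np.mean(latencies_ms):.1f} ms '
      f'(min {min(latencies_ms):.1f} ms over {len(latencies_ms)} runs)')
```

Running the same script unchanged on an ARM-based board and on an x86 desktop yields the kind of side-by-side delay comparison the abstract describes.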

Details

Language :
English
ISSN :
2073-8994
Volume :
11
Issue :
4
Database :
Complementary Index
Journal :
Symmetry (2073-8994)
Publication Type :
Academic Journal
Accession number :
136175965
Full Text :
https://doi.org/10.3390/sym11040496