
A Deep Learning Based Printing Defect Classification Method with Imbalanced Samples.

Authors :
Zhang, Erhu
Li, Bo
Li, Peilin
Chen, Yajun
Source :
Symmetry (20738994); Dec 2019, Vol. 11 Issue 12, p1440-1440, 1p
Publication Year :
2019

Abstract

Deep learning has been successfully applied to classification tasks in many fields due to its good performance in learning discriminative features. However, the application of deep learning to printing defect classification is very rare, and there is almost no research on classification methods for printing defects with imbalanced samples. In this paper, we present a deep convolutional neural network model that extracts deep features directly from printed image defects. Furthermore, considering the asymmetry in the number of defect samples across classes, seven types of over-sampling methods were investigated to determine the best one. To verify the practical applicability of the proposed deep model and the effectiveness of the extracted features, a large dataset of printing defect samples was built. All samples were collected from practical printing products in the factory. The dataset includes a coarse-grained dataset with four types of printing samples and a fine-grained dataset with eleven types of printing samples. The experimental results show that the proposed deep model achieves a 96.86% classification accuracy on the coarse-grained dataset without adopting over-sampling, which is higher than that of well-known deep models based on transfer learning. Moreover, by combining the proposed deep model with the SVM-SMOTE over-sampling method, the accuracy on the fine-grained dataset is improved by more than 20% compared to the method without over-sampling. [ABSTRACT FROM AUTHOR]
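The abstract names SVM-SMOTE as the over-sampling method applied to the deep features before classification. The sketch below is not the authors' pipeline; it only illustrates the general technique using the SVMSMOTE implementation from the imbalanced-learn package, with a synthetic feature matrix standing in for the CNN-extracted defect features, and with hypothetical class proportions chosen purely for illustration.

```python
# Minimal sketch of SVM-SMOTE over-sampling for an imbalanced defect dataset.
# This is NOT the paper's implementation: feature extraction by the authors'
# deep CNN is replaced by a synthetic feature matrix for illustration.
from collections import Counter

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import classification_report
from imblearn.over_sampling import SVMSMOTE

# Stand-in for deep features of defect images: four imbalanced classes,
# e.g. one dominant defect type and three rare ones (assumed proportions).
X, y = make_classification(
    n_samples=2000, n_features=64, n_informative=32, n_classes=4,
    n_clusters_per_class=1, weights=[0.7, 0.15, 0.1, 0.05], random_state=0,
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, test_size=0.25, random_state=0,
)
print("before over-sampling:", Counter(y_train))

# SVM-SMOTE synthesizes minority-class samples near the SVM decision boundary.
X_res, y_res = SVMSMOTE(random_state=0).fit_resample(X_train, y_train)
print("after over-sampling: ", Counter(y_res))

# Train any downstream classifier on the balanced feature set.
clf = SVC(kernel="rbf", gamma="scale").fit(X_res, y_res)
print(classification_report(y_test, clf.predict(X_test)))
```

In the paper's setting, X_train and y_train would instead hold the features produced by the proposed deep model for the fine-grained eleven-class defect dataset; only the over-sampling step shown here corresponds to the SVM-SMOTE stage the abstract describes.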

Details

Language :
English
ISSN :
20738994
Volume :
11
Issue :
12
Database :
Complementary Index
Journal :
Symmetry (20738994)
Publication Type :
Academic Journal
Accession number :
141259374
Full Text :
https://doi.org/10.3390/sym11121440