Deep-learning-based fault detection and recipe optimization for a plastic injection molding process under the class-imbalance problem
- Authors
- Jin Uk Ko, Jinwook Lee, Taehun Kim, Yong Chae Kim, and Byeng D. Youn
- Subjects
- Human-Computer Interaction, Computational Mathematics, Modeling and Simulation, Computational Mechanics, Computer Graphics and Computer-Aided Design, Engineering (miscellaneous)
- Abstract
This paper proposes a supervised learning with a class-balancing loss function (SL-CBL) approach for fault detection, together with feature-similarity-based recipe optimization (FSRO), for a plastic injection molding process. SL-CBL is a novel method that can accurately classify an input sample as normal or faulty, even when the training data are severely class-imbalanced. The proposed class-balancing loss function combines a weighted focal loss with an F1-score loss; together, these terms drive the correct classification of even a small number of faulty samples. SL-CBL is investigated with four classifiers of different structures, each composed of several fully connected and batch normalization layers. FSRO is an optimization scheme that finds the optimal recipe whose feature representation is similar to the features of normal samples; the optimal solution is obtained by minimizing the Euclidean distance to the centroid of the normal features. The proposed SL-CBL and FSRO methods are validated on an industrial plastic injection molding dataset. The validation results show that the proposed SL-CBL approach achieves the highest F1 score with the lowest misclassification rate compared to the alternative methods. When the feature space is visualized, the optimal recipe found by FSRO lies close to the centroid of the normal features, even when the initial recipe is classified as faulty. Furthermore, each variable of the optimized recipe lies within the $3\sigma$ confidence interval of the normal condition, indicating that the optimal recipe is statistically similar to the normal samples.
- Published
- 2023
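To make the two ideas in the abstract concrete, the sketch below shows, under stated assumptions, (i) a class-balancing loss built from a weighted focal loss plus a differentiable F1-based term, and (ii) a feature-similarity recipe search that minimizes the Euclidean distance to the centroid of normal-condition features. The names `ClassBalancingLoss`, `optimize_recipe`, and `encoder`, the hyperparameters `alpha`, `gamma`, and `lam`, the soft-F1 formulation, and the use of gradient descent are all illustrative assumptions; the paper does not specify these details in the abstract.

```python
import torch
import torch.nn as nn


class ClassBalancingLoss(nn.Module):
    """Illustrative class-balancing loss: weighted focal loss plus an
    F1-score-based term. alpha, gamma, and lam are hypothetical
    hyperparameters, not values from the paper."""

    def __init__(self, alpha: float = 0.75, gamma: float = 2.0, lam: float = 1.0):
        super().__init__()
        self.alpha, self.gamma, self.lam = alpha, gamma, lam

    def forward(self, logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        # logits: (N,) raw scores; targets: (N,) floats in {0, 1}, where 1 = fault
        p = torch.sigmoid(logits)
        pt = p * targets + (1 - p) * (1 - targets)                      # prob. of the true class
        w = self.alpha * targets + (1 - self.alpha) * (1 - targets)     # per-class weight
        focal = (-w * (1 - pt) ** self.gamma * torch.log(pt.clamp(min=1e-8))).mean()

        # Soft (differentiable) F1: counts computed from probabilities instead of hard labels
        tp = (p * targets).sum()
        fp = (p * (1 - targets)).sum()
        fn = ((1 - p) * targets).sum()
        soft_f1 = 2 * tp / (2 * tp + fp + fn + 1e-8)
        return focal + self.lam * (1 - soft_f1)


def optimize_recipe(encoder: nn.Module,
                    normal_features: torch.Tensor,
                    recipe_init: torch.Tensor,
                    steps: int = 200,
                    lr: float = 1e-2) -> torch.Tensor:
    """Illustrative feature-similarity-based recipe search: adjust the recipe
    so its encoded feature moves toward the centroid of the normal-condition
    features (Euclidean distance objective). Gradient descent is used here
    only for illustration; the paper's optimization scheme may differ."""
    centroid = normal_features.mean(dim=0)                  # centroid of normal features
    recipe = recipe_init.clone().requires_grad_(True)
    opt = torch.optim.Adam([recipe], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        dist = torch.norm(encoder(recipe) - centroid)       # Euclidean distance to centroid
        dist.backward()
        opt.step()
    return recipe.detach()
```

In this sketch, `encoder` stands for the feature extractor (e.g., the trained classifier up to its penultimate layer) and `normal_features` for the encoded normal samples; a candidate recipe classified as faulty would be passed as `recipe_init` and nudged until its feature lies near the normal centroid.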