1. Application of a Committee of Kohonen Neural Networks to Training of an Image Classifier Based on a Description as a Set of Descriptors
- Authors
Volodymyr Gorokhovatskyi, Iryna Tvoroshenko, Olena Yakovleva, Monika Hudakova, and Oleksii Gorokhovatskyi
- Subjects
Classification accuracy, classifier training, image classification, Kohonen network, set of descriptors, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
The research aims to improve structural methods of image classification based on a description as a set of keypoint descriptors. The focus is on training a classifier using a committee of Kohonen networks, one network per description in an etalon database. The training result is a fixed set of data centroids, which ensures high classification speed. The improvement consists in training each etalon independently, which increases the accuracy of approximating descriptions with a set of centroids and keeps the overall classification efficiency at a decent level. Calculating the cluster centroids separately for each class prevents the influence of descriptors from other classes. In addition, independent training is effective when the powers (cardinalities) of the etalon descriptions differ. Compared to traditional linear search, the classification speed of the proposed method increases in proportion to the ratio of the description power to the number of generated centroids. The experimental modeling of the proposed methods uses a database of coin images. The test sample is formed from the etalon database by applying geometric transformations of shift, scale, and rotation. Testing has shown a high level of classification accuracy after the proposed training and revealed a practical opportunity to choose a network structure and parameters that provide the required accuracy and speed for an applied task.
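The scheme described in the abstract can be illustrated with a minimal sketch: each class (etalon) trains its own winner-take-all Kohonen network on its descriptor set alone, yielding a small fixed set of centroids, and a query image is classified by letting each of its descriptors vote for the class of its nearest centroid. This is an illustrative simplification under assumed conventions (Euclidean distance, a linearly decaying learning rate, no neighborhood function); the function names and parameters below are hypothetical, not taken from the paper.

```python
import numpy as np

def train_kohonen(descriptors, n_centroids, epochs=20, lr0=0.5, seed=0):
    """Train a simple winner-take-all Kohonen network on ONE etalon's
    descriptor set only, so centroids are never pulled toward other
    classes; returns a fixed (n_centroids, dim) array of centroids."""
    rng = np.random.default_rng(seed)
    # initialize centroids from random descriptors of this class
    idx = rng.choice(len(descriptors), n_centroids, replace=False)
    centroids = descriptors[idx].astype(float)
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)  # decaying learning rate
        for x in descriptors[rng.permutation(len(descriptors))]:
            # winner = nearest centroid; pull it toward the input
            w = np.argmin(np.linalg.norm(centroids - x, axis=1))
            centroids[w] += lr * (x - centroids[w])
    return centroids

def classify(descriptors, committee):
    """Each query descriptor votes for the class whose centroid set
    contains its nearest centroid; the majority class wins.  The search
    cost depends on the (small) number of centroids, not on the full
    etalon description power."""
    votes = {cls: 0 for cls in committee}
    for x in descriptors:
        best_cls, best_d = None, np.inf
        for cls, cents in committee.items():
            d = np.linalg.norm(cents - x, axis=1).min()
            if d < best_d:
                best_cls, best_d = cls, d
        votes[best_cls] += 1
    return max(votes, key=votes.get)
```

As a usage sketch on synthetic 2-D "descriptors": train one network per class, e.g. `committee = {'A': train_kohonen(desc_a, 4), 'B': train_kohonen(desc_b, 4)}`, then call `classify(query_descriptors, committee)`. Replacing a description of power N by k centroids reduces the per-descriptor comparison count from N to k, which mirrors the speedup ratio stated in the abstract.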
- Published
- 2024