Discriminative dictionary learning for nonnegative representation based classification.
- Author
- Qu, Xiwen; Huang, Jun; Cheng, Zekai
- Subjects
- *ENCYCLOPEDIAS & dictionaries, *ARTIFICIAL neural networks, *CLASSIFICATION
- Abstract
Representation-based classification (RC) algorithms have been successfully applied to pattern classification. However, most existing RC algorithms are not robust to bad training samples, since they directly use training samples as dictionary atoms rather than learning more discriminative atoms. In addition, to improve expressive ability and recognition accuracy, RC algorithms often need to expand the number of dictionary atoms, which causes storage and computing costs to surge. To obtain a more discriminative and compact dictionary, this study proposes discriminative dictionary learning for nonnegative representation based classification (DDLNRC). Specifically, DDLNRC imposes a nonnegativity constraint to obtain a nonnegative representation of each training sample on the dictionary. In the dictionary learning stage, for each training sample, DDLNRC minimizes the sample's intra-class reconstruction error while simultaneously enlarging the distance between the sample and the atom that contributes most to the inter-class reconstruction error. Experiments demonstrate the effectiveness of DDLNRC. Combined with deep neural network features, it can also achieve higher accuracy than the Softmax classifier.
• This paper proposes DDLNRC for pattern recognition.
• DDLNRC minimizes intra-class reconstruction error.
• DDLNRC also enlarges inter-class reconstruction error.
• With DDLNRC, a discriminative and compact dictionary can be obtained.
• Experiments demonstrate the effectiveness of DDLNRC on pattern recognition.
[ABSTRACT FROM AUTHOR]
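To make the classification stage concrete, the following is a minimal sketch of nonnegative representation based classification: each test sample is coded nonnegatively on each class's dictionary, and the class with the smallest reconstruction residual wins. This illustrates only the generic NRC decision rule, not the authors' DDLNRC dictionary-learning procedure; the toy dictionaries `D0` and `D1` and the function name `nrc_classify` are hypothetical.

```python
import numpy as np
from scipy.optimize import nnls  # solves min ||D a - x||_2 subject to a >= 0

def nrc_classify(x, class_dicts):
    """Assign x to the class whose dictionary gives the smallest
    nonnegative-coding residual (the NRC decision rule)."""
    residuals = [nnls(D, x)[1] for D in class_dicts]  # nnls returns (coef, residual norm)
    return int(np.argmin(residuals))

# Toy dictionaries (columns are atoms): class 0 atoms lie near e1, class 1 near e2.
D0 = np.array([[1.0, 0.9],
               [0.0, 0.1]])
D1 = np.array([[0.0, 0.1],
               [1.0, 0.9]])

print(nrc_classify(np.array([1.0, 0.05]), [D0, D1]))  # → 0
print(nrc_classify(np.array([0.05, 1.0]), [D0, D1]))  # → 1
```

The nonnegativity constraint forces each sample to be built additively from atoms, which is what makes the class-wise residual comparison meaningful; DDLNRC's contribution is learning the atoms themselves so that intra-class residuals shrink and inter-class residuals grow.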
- Published
- 2024