Multi-kernel partial label learning using graph contrast disambiguation.
- Author
- Li, Hongyan, Wan, Zhonglin, and Vong, Chi Man
- Subjects
OPTIMIZATION algorithms; DATA distribution; TASK performance; AMBIGUITY; CLASSIFICATION
- Abstract
Partial label learning (PLL) handles data classification problems by assigning a candidate label set to each sample, where exactly one label in the set is the ground truth. Since PLL can achieve classification without precise labels, it reduces the cost of data annotation. However, PLL faces challenges caused by the ambiguity of the candidate labels. Most PLL algorithms eliminate this ambiguity by treating each candidate label equally or by iteratively identifying the ground-truth label, without employing optimized kernels, even though the optimal kernel can ensure better performance on PLL tasks. Moreover, there is no general framework for handling heterogeneous data classification across applications. Inspired by the success of multi-kernel learning in machine learning, this paper integrates multi-kernel learning into the PLL framework to develop a new multi-kernel PLL (PL-MKL) algorithm, which adopts different kernels to map the original sample attributes to distinct nonlinear feature spaces. The model's classification performance is thereby enhanced by combining the mapping capabilities of multiple feature spaces while fully exploiting the intrinsic distribution of the data. Furthermore, PL-MKL combines similarity and dispersion graphs into an innovative graph contrast disambiguation method. This approach preserves the manifold structure of the data and reflects the differences among candidate labels, reducing intra-class differences while enlarging inter-class ones. An efficient optimization algorithm is proposed to attain these objectives. Extensive experiments demonstrate the competitive or superior performance of the proposed PL-MKL over state-of-the-art approaches. [ABSTRACT FROM AUTHOR]
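The multi-kernel idea in the abstract can be illustrated generically: map the samples through several base kernels and combine the resulting Gram matrices with a convex weighting. This is only a minimal sketch of kernel combination, not the paper's PL-MKL objective; the RBF widths and the uniform weights here are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X, gamma):
    """Gram matrix of the RBF kernel exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    # pairwise squared Euclidean distances via the expansion ||a-b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def combined_kernel(X, gammas, weights):
    """Convex combination of base RBF kernels (weights assumed to sum to 1)."""
    kernels = [rbf_kernel(X, g) for g in gammas]
    return sum(w * K_m for w, K_m in zip(weights, kernels))

# toy usage with assumed hyperparameters
X = np.random.default_rng(0).normal(size=(5, 3))
K = combined_kernel(X, gammas=[0.1, 1.0], weights=[0.5, 0.5])
```

Because each base Gram matrix is positive semidefinite and the weights are nonnegative, the combined matrix remains a valid kernel, which is the property multi-kernel methods rely on when learning the weighting.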
- Published
- 2024