201. Multi-view graph representation with similarity diffusion for general zero-shot learning.
- Author
- Yu B, Xie C, Tang P, and Duan H
- Subjects
- Humans, Diffusion, Knowledge, Knowledge Bases, Learning, Benchmarking
- Abstract
Zero-shot learning (ZSL) aims to predict unseen classes without using samples of those classes in model training. ZSL has been widely used in many knowledge-based models and applications to predict various targets, including categories, subjects, and anomalies, across different domains. Nonetheless, most existing ZSL methods require pre-defined semantics or attributes of particular data environments. These methods are therefore difficult to apply to general data environments, such as ImageNet and other real-world datasets and applications. Recent research has tried to use open knowledge to enhance ZSL methods and adapt them to open data environments. However, the performance of these methods is relatively low, with accuracy typically below 10%, owing to the inadequate semantics that can be drawn from open knowledge. Moreover, the latest methods suffer from a significant "semantic gap" between the generated features of unseen classes and the real features of seen classes. To this end, this paper proposes a multi-view graph representation with a similarity diffusion model, applying ZSL tasks to general data environments. The model uses a multi-view graph to fully enrich the semantics and introduces a novel diffusion method to augment the graph representation. In addition, a feature diffusion method is proposed to augment the multi-view graph representation and bridge the semantic gap, realizing zero-shot prediction. The results of numerous experiments in general data environments and on benchmark datasets show that the proposed method achieves new state-of-the-art results in the field of general zero-shot learning.
Furthermore, seven ablation studies analyze in detail the effects of the settings and the different modules of the proposed method on its performance and demonstrate the effectiveness of each module.
- Competing Interests
- Declaration of competing interest: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
- Copyright
- © 2023 The Author(s). Published by Elsevier Ltd. All rights reserved.
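To make the abstract's premise concrete, the following is a minimal, illustrative sketch of classic attribute-based zero-shot prediction, not the paper's multi-view graph method: each class is described by a semantic (attribute) vector, and a sample whose embedding has been projected into that semantic space is assigned to the unseen class with the most similar vector. The class names and attribute vectors below are hypothetical examples.

```python
import numpy as np

def zero_shot_predict(sample_embedding, class_semantics):
    """Return the class whose semantic vector is most cosine-similar
    to the sample's predicted semantic embedding."""
    names = list(class_semantics)
    mat = np.stack([class_semantics[n] for n in names])
    sims = mat @ sample_embedding / (
        np.linalg.norm(mat, axis=1) * np.linalg.norm(sample_embedding)
    )
    return names[int(np.argmax(sims))]

# Hypothetical attribute vectors for two unseen classes:
# dimensions stand for (striped, four-legged, flying).
unseen = {
    "zebra": np.array([1.0, 1.0, 0.0]),
    "eagle": np.array([0.0, 0.0, 1.0]),
}

# Semantic embedding predicted from an image of an unseen class.
x = np.array([0.9, 0.8, 0.1])
print(zero_shot_predict(x, unseen))  # -> zebra
```

The "semantic gap" the abstract mentions arises exactly here: the projected embedding `x` rarely lies close to the true class vector, which is what the paper's diffusion-based augmentation is designed to mitigate.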
- Published
- 2023