Leveraging Balanced Semantic Embedding for Generative Zero-Shot Learning.

Authors :
Xie GS
Zhang XY
Xiang TZ
Zhao F
Zhang Z
Shao L
Li X
Source :
IEEE transactions on neural networks and learning systems [IEEE Trans Neural Netw Learn Syst] 2023 Nov; Vol. 34 (11), pp. 9575-9582. Date of Electronic Publication: 2022 Oct 27.
Publication Year :
2023

Abstract

Generative (generalized) zero-shot learning [(G)ZSL] models aim to synthesize unseen-class features using only seen-class feature and attribute pairs as training data. However, the generated fake unseen features tend to be dominated by the seen-class features and are thus classified as seen classes, which can lead to inferior performance under zero-shot learning (ZSL) and unbalanced results under generalized ZSL (GZSL). To address this challenge, we tailor a novel balanced semantic embedding generative network (BSeGN), which incorporates balanced semantic embedding learning into generative learning scenarios in pursuit of unbiased GZSL. Specifically, we first design a feature-to-semantic embedding module (FEM) that distinguishes real seen features from fake unseen features, working collaboratively with the generator in an online manner. We introduce bidirectional contrastive and balance losses for FEM learning, which guarantee balanced predictions for interdomain features; in turn, the updated FEM boosts the learning of the generator. Next, we propose a multilevel feature integration module (mFIM) within the cycle-consistency branch of BSeGN, which mitigates domain bias through feature enhancement. To the best of our knowledge, this is the first work to explore embedding and generative learning jointly within the field of ZSL. Extensive evaluations on four benchmarks demonstrate the superiority of BSeGN over its state-of-the-art counterparts.
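To give intuition for the abstract's central idea, the sketch below is a minimal, hypothetical PyTorch rendering of a feature-to-semantic embedding module and a balance-style loss. The abstract provides no formulas, so every name, dimension, and loss form here (the FEM class, balance_loss, the seen/unseen probability-mass penalty) is an assumption for illustration, not the authors' actual method.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class FEM(nn.Module):
        """Hypothetical feature-to-semantic embedding module: maps visual
        features into the class-attribute space, where compatibility
        scores against class attribute vectors serve as class logits."""
        def __init__(self, feat_dim, attr_dim):
            super().__init__()
            self.embed = nn.Sequential(
                nn.Linear(feat_dim, attr_dim),
                nn.ReLU(),
                nn.Linear(attr_dim, attr_dim),
            )

        def forward(self, feats, class_attrs):
            # Cosine-style compatibility between embedded features and
            # every class attribute vector -> (batch, num_classes) logits.
            e = F.normalize(self.embed(feats), dim=-1)
            a = F.normalize(class_attrs, dim=-1)
            return e @ a.t()

    def balance_loss(logits, seen_idx, unseen_idx):
        # Illustrative "balance" penalty: push the total probability mass
        # assigned to seen classes and to unseen classes toward equality,
        # discouraging the seen-class bias the abstract describes. The
        # exact loss used in the paper may differ.
        probs = logits.softmax(dim=-1)
        seen_mass = probs[:, seen_idx].sum(dim=-1)
        unseen_mass = probs[:, unseen_idx].sum(dim=-1)
        return (seen_mass - unseen_mass).abs().mean()

    # Example usage with random stand-ins (all dimensions are assumptions):
    fem = FEM(feat_dim=2048, attr_dim=312)
    feats = torch.randn(8, 2048)       # e.g., generated fake unseen features
    attrs = torch.randn(200, 312)      # per-class attribute vectors
    logits = fem(feats, attrs)
    loss = balance_loss(logits, seen_idx=list(range(150)),
                        unseen_idx=list(range(150, 200)))

In this reading, the generator and the FEM are trained jointly: the balance term keeps generated unseen features from collapsing onto seen classes, while the FEM's feedback shapes the generator, matching the mutual-boosting loop described in the abstract.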

Details

Language :
English
ISSN :
2162-2388
Volume :
34
Issue :
11
Database :
MEDLINE
Journal :
IEEE transactions on neural networks and learning systems
Publication Type :
Academic Journal
Accession number :
36269927
Full Text :
https://doi.org/10.1109/TNNLS.2022.3208525