
Domain structure-based transfer learning for cross-domain word representation.

Authors :
Huang, Heyan
Liu, Qian
Source :
Information Fusion. Dec 2021, Vol. 76, p145-156. 12p.
Publication Year :
2021

Abstract

Cross-domain word representation aims to learn high-quality semantic representations in an under-resourced domain by leveraging information from a resource-rich domain. However, most existing methods transfer only the semantics of common words across domains, ignoring the semantic relations among domain-specific words. In this paper, we propose a domain structure-based transfer learning method that learns cross-domain representations by leveraging the relations among domain-specific words. To accomplish this, we first construct a semantic graph that captures the latent domain structure using domain-specific co-occurrence information. Then, in the domain adaptation process, beyond domain alignment, we employ Laplacian Eigenmaps to ensure that the domain structure is consistently distributed in the learned embedding space. As such, the learned cross-domain word representations not only capture shared semantics across domains but also maintain the latent domain structure. We performed extensive experiments on two tasks, namely sentiment analysis and query expansion. The experimental results show the effectiveness of our method for tasks in under-resourced domains.

• A cross-domain word representation method for tasks with limited resources.
• Fusing the latent semantics of the source and target domains.
• Propagating common semantic knowledge from the source domain to the target domain.
• Preserving domain structure in the domain adaptation process.
• Effective and efficient for text-related and low-resource tasks.

[ABSTRACT FROM AUTHOR]
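The structure-preservation step described in the abstract relies on Laplacian Eigenmaps applied to a co-occurrence graph. The sketch below illustrates that general technique on a toy graph; the word count, edge weights, and the `laplacian_eigenmaps` helper are all invented for illustration and do not reproduce the authors' actual method:

```python
import numpy as np

# Toy symmetric co-occurrence matrix for 5 hypothetical domain-specific
# words: words 0-2 co-occur often with each other, words 3-4 with each
# other, and only a weak link (weight 1) bridges the two groups.
W = np.array([
    [0, 4, 2, 0, 0],
    [4, 0, 3, 0, 0],
    [2, 3, 0, 1, 0],
    [0, 0, 1, 0, 5],
    [0, 0, 0, 5, 0],
], dtype=float)

def laplacian_eigenmaps(W, dim=2):
    """Embed graph nodes so strongly co-occurring words stay close."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    # Symmetric normalized Laplacian: I - D^{-1/2} W D^{-1/2}
    L_sym = np.eye(len(W)) - D_inv_sqrt @ W @ D_inv_sqrt
    # eigh returns eigenvalues in ascending order for symmetric matrices.
    vals, vecs = np.linalg.eigh(L_sym)
    # Drop the trivial eigenvector (eigenvalue ~0); keep the next `dim`.
    return vecs[:, 1:dim + 1]

emb = laplacian_eigenmaps(W, dim=2)
```

Because the embedding minimizes a weighted sum of squared distances between connected nodes, words with heavy co-occurrence edges (e.g. words 0 and 1) end up closer in the learned space than weakly connected ones, which is the sense in which the latent domain structure is preserved.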

Details

Language :
English
ISSN :
1566-2535
Volume :
76
Database :
Academic Search Index
Journal :
Information Fusion
Publication Type :
Academic Journal
Accession number :
151815999
Full Text :
https://doi.org/10.1016/j.inffus.2021.05.013