
Enhancing Dual-Encoders with Question and Answer Cross-Embeddings for Answer Retrieval

Authors :
Wang, Yanmeng
Bai, Jun
Wang, Ye
Zhang, Jianfei
Rong, Wenge
Ji, Zongcheng
Wang, Shaojun
Xiao, Jing
Publication Year :
2022

Abstract

Dual-Encoders are a promising mechanism for answer retrieval in question answering (QA) systems. Most conventional Dual-Encoders learn the semantic representations of questions and answers merely through a matching score. Researchers have proposed introducing QA interaction features into the scoring function, but at the cost of low efficiency at inference time. To keep questions and answers independently encoded at inference time, a variational auto-encoder has further been introduced that reconstructs answers (questions) from question (answer) embeddings as an auxiliary task, enhancing QA interaction in representation learning during training. However, the needs of text generation and answer retrieval differ, which makes such training difficult. In this work, we propose a framework that enhances the Dual-Encoders model with question-answer cross-embeddings and a novel Geometry Alignment Mechanism (GAM), which aligns the geometry of embeddings from the Dual-Encoders with that from the Cross-Encoders. Extensive experimental results show that our framework significantly improves the Dual-Encoders model and outperforms the state-of-the-art method on multiple answer retrieval datasets.

Comment: Findings of EMNLP 2021 (10 pages)
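The efficiency property the abstract refers to can be sketched as follows: because a dual-encoder scores a question-answer pair via a similarity between independently computed embeddings, all answer embeddings can be precomputed offline, and only the question is encoded at query time. A cross-encoder, by contrast, must re-encode every question-answer pair jointly. This is a minimal toy sketch, not the paper's model; the `embed` function below is a hypothetical stand-in for a trained neural encoder.

```python
import zlib
import numpy as np

def embed(text, dim=8):
    """Toy stand-in encoder: sums a fixed pseudo-random vector per token
    and normalizes. A real dual-encoder would use a trained network."""
    v = np.zeros(dim)
    for tok in text.lower().split():
        # Seed from a deterministic token hash so the same token always
        # maps to the same vector, within and across runs.
        rng = np.random.default_rng(zlib.crc32(tok.encode()))
        v += rng.standard_normal(dim)
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

# Offline stage: precompute answer embeddings once. This is the key
# efficiency advantage of dual-encoders over cross-encoders.
answers = [
    "paris is the capital of france",
    "the mitochondria is the powerhouse of the cell",
]
answer_matrix = np.stack([embed(a) for a in answers])

# Online stage: encode the question independently, then rank candidate
# answers by a simple dot-product matching score.
question = embed("what is the capital of france")
scores = answer_matrix @ question
best = int(np.argmax(scores))  # index of the highest-scoring answer
```

The interaction-based alternatives the abstract mentions (cross-encoders, VAE reconstruction) enrich this matching score with joint QA features during training, while the proposed GAM keeps inference in exactly this independent-encoding form.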

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2206.02978
Document Type :
Working Paper
Full Text :
https://doi.org/10.18653/v1/2021.findings-emnlp.198