
Weakly Supervised Pre-Training for Multi-Hop Retriever

Authors:
Yeon Seonwoo
Alice Oh
Ji-Hoon Kim
Jung-Woo Ha
Sang Woo Lee
Source:
ACL/IJCNLP (Findings)
Publication Year:
2021
Publisher:
arXiv, 2021.

Abstract

In multi-hop QA, answering complex questions requires iterative document retrieval to find the missing entity of the question. The main steps of this process are sub-question detection, document retrieval for the sub-question, and generation of a new query for the final document retrieval. However, building a dataset that contains complex questions with their sub-questions and corresponding documents requires costly human annotation. To address this issue, we propose a new method for weakly supervised multi-hop retriever pre-training that needs no human effort. Our method includes 1) a pre-training task for generating vector representations of complex questions, 2) a scalable data generation method that produces the nested structure of question and sub-question as weak supervision for pre-training, and 3) a pre-training model structure based on dense encoders. We conduct experiments comparing our pre-trained retriever with several state-of-the-art models on end-to-end multi-hop QA as well as document retrieval. The results show that our pre-trained retriever is effective and remains robust under limited data and computational resources.

Comment: ACL-Findings 2021
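To make the iterative retrieval loop described in the abstract concrete, here is a minimal sketch of two-hop dense retrieval: encode the complex question, retrieve a first document, form a new query from the question plus that document, and retrieve again. The toy hash-based encoder, the example corpus, and all function names are illustrative assumptions, not the paper's pre-trained encoders or data.

```python
# Minimal two-hop dense retrieval sketch (illustrative only; a toy encoder
# stands in for the trained dense question/document encoders).
import hashlib
import numpy as np

DIM = 64

def toy_encode(text: str) -> np.ndarray:
    """Deterministic toy encoder: a placeholder for a real dense encoder."""
    seed = int(hashlib.md5(text.encode("utf-8")).hexdigest(), 16) % (2**32)
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(DIM)
    return v / np.linalg.norm(v)

def retrieve(query_vec: np.ndarray, doc_vecs: np.ndarray, k: int = 1) -> np.ndarray:
    """Return indices of the top-k documents by inner-product similarity."""
    scores = doc_vecs @ query_vec
    return np.argsort(-scores)[:k]

# Hypothetical document corpus.
docs = [
    "Document about the author of a famous novel.",
    "Document about the birthplace of that author.",
    "An unrelated document.",
]
doc_vecs = np.stack([toy_encode(d) for d in docs])

# Hop 1: retrieve a document for the complex question.
question = "Where was the author of the novel born?"
first_hop = retrieve(toy_encode(question), doc_vecs, k=1)[0]

# Hop 2: build a new query from the question and the retrieved document
# (one common way to fill in the question's missing entity), then retrieve again.
new_query = question + " [SEP] " + docs[first_hop]
second_hop = retrieve(toy_encode(new_query), doc_vecs, k=1)[0]

print("hop 1:", docs[first_hop])
print("hop 2:", docs[second_hop])
```

In the paper's setting, the weakly supervised nested question/sub-question pairs would supply training signal for the encoders used in place of the toy encoder above.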

Details

Database:
OpenAIRE
Journal:
ACL/IJCNLP (Findings)
Accession number:
edsair.doi.dedup.....4300a127a2e68af523d39dbea8f5ff03
Full Text:
https://doi.org/10.48550/arxiv.2106.09983