
SimLM: Pre-training with Representation Bottleneck for Dense Passage Retrieval

Authors :
Wang, Liang
Yang, Nan
Huang, Xiaolong
Jiao, Binxing
Yang, Linjun
Jiang, Daxin
Majumder, Rangan
Wei, Furu
Publication Year :
2022

Abstract

In this paper, we propose SimLM (Similarity matching with Language Model pre-training), a simple yet effective pre-training method for dense passage retrieval. It employs a simple bottleneck architecture that learns to compress the passage information into a dense vector through self-supervised pre-training. We use a replaced language modeling objective, inspired by ELECTRA, to improve sample efficiency and reduce the mismatch between the input distributions of pre-training and fine-tuning. SimLM only requires access to an unlabeled corpus and is therefore more broadly applicable when no labeled data or queries are available. We conduct experiments on several large-scale passage retrieval datasets and show substantial improvements over strong baselines under various settings. Remarkably, SimLM even outperforms multi-vector approaches such as ColBERTv2, which incurs a significantly higher storage cost. Our code and model checkpoints are available at https://github.com/microsoft/unilm/tree/master/simlm.

Comment: Accepted to ACL 2023
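
As a rough illustration of the bottleneck idea described in the abstract, the sketch below pairs a deep encoder with a shallow decoder that must recover the original tokens of a corrupted passage while conditioning on the encoder's single [CLS] vector, which is what pushes passage information into the dense bottleneck. All module names, layer counts, and the BottleneckPretrainer class are assumptions made for illustration in plain PyTorch, not the authors' released implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BottleneckPretrainer(nn.Module):
    """Deep encoder + shallow decoder trained with a replaced language
    modeling loss: predict the original token at every position of a
    corrupted passage, with the decoder conditioned on the encoder's
    [CLS] vector so passage information is squeezed into that vector."""

    def __init__(self, vocab_size=30522, hidden=768, enc_layers=12, dec_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(hidden, nhead=12, batch_first=True), enc_layers
        )
        self.decoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(hidden, nhead=12, batch_first=True), dec_layers
        )
        self.lm_head = nn.Linear(hidden, vocab_size)

    def forward(self, corrupted_ids, original_ids):
        # corrupted_ids: passage with some tokens replaced (e.g. by a small generator)
        # original_ids:  the uncorrupted passage, the prediction target at every position
        enc_out = self.encoder(self.embed(corrupted_ids))
        cls_vec = enc_out[:, :1, :]  # the dense bottleneck vector

        # Decoder sees only the bottleneck vector (in place of [CLS]) plus the
        # corrupted token embeddings; recovering the originals forces the encoder
        # to pack passage semantics into cls_vec.
        dec_in = torch.cat([cls_vec, self.embed(corrupted_ids)[:, 1:, :]], dim=1)
        logits = self.lm_head(self.decoder(dec_in))
        return F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                               original_ids.reshape(-1))


# Toy usage with random token ids standing in for a corrupted/original pair.
model = BottleneckPretrainer()
corrupted = torch.randint(0, 30522, (2, 16))
original = torch.randint(0, 30522, (2, 16))
loss = model(corrupted, original)
loss.backward()
```

Keeping the decoder shallow is the key design choice in this kind of bottleneck pre-training: a deep decoder could reconstruct the tokens from its own inputs alone, which would let the encoder off the hook and weaken the dense [CLS] representation.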

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1381551883
Document Type :
Electronic Resource