
LongEmbed: Extending Embedding Models for Long Context Retrieval

Authors :
Zhu, Dawei
Wang, Liang
Yang, Nan
Song, Yifan
Wu, Wenhao
Wei, Furu
Li, Sujian
Publication Year :
2024

Abstract

Embedding models play a pivotal role in modern NLP applications such as information retrieval (IR) and retrieval-augmented generation (RAG). While the context limit of LLMs has been pushed beyond 1 million tokens, embedding models remain confined to a narrow context window not exceeding 8k tokens, which bars them from application scenarios requiring long inputs such as legal contracts. This paper explores context window extension of existing embedding models, pushing the limit to 32k without requiring additional training. First, we examine the performance of current embedding models for long-context retrieval on our newly constructed LongEmbed benchmark. LongEmbed comprises two synthetic tasks and four carefully chosen real-world tasks, featuring documents of varying length and dispersed target information. Benchmarking results underscore huge room for improvement in these models. Building on this, comprehensive experiments show that training-free context window extension strategies like position interpolation can effectively extend the context window of existing embedding models severalfold, regardless of whether their original context is 512 or beyond 4k tokens. Furthermore, for models employing absolute position encoding (APE), we show the possibility of further fine-tuning to harvest notable performance gains while strictly preserving original behavior for short inputs. For models using rotary position embedding (RoPE), significant enhancements are observed when employing RoPE-specific methods, such as NTK and SelfExtend, indicating RoPE's superiority over APE for context window extension. To facilitate future research, we release E5-Base-4k and E5-RoPE-Base, along with the LongEmbed benchmark.
Comment: Fix results for Nomic
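
As a rough illustration of the position-interpolation idea the abstract refers to (a sketch, not the authors' released code), the following Python snippet scales RoPE position indices so that a model trained on 512 tokens never sees rotation angles beyond its training range when given longer inputs. The function name and the parameter trained_len are illustrative assumptions, not identifiers from the paper.

import torch

def rope_angles(seq_len: int, head_dim: int, trained_len: int = 512,
                base: float = 10000.0) -> torch.Tensor:
    # Standard RoPE inverse frequencies, one per pair of head dimensions.
    inv_freq = 1.0 / (base ** (torch.arange(0, head_dim, 2).float() / head_dim))
    # Position interpolation: compress position ids by trained_len / seq_len
    # so every rotation angle stays within the range seen during training.
    scale = min(1.0, trained_len / seq_len)
    positions = torch.arange(seq_len).float() * scale
    return torch.outer(positions, inv_freq)  # shape: (seq_len, head_dim // 2)

# Example: extending a 512-token model to 2048 tokens. Position 2047 is
# mapped back into [0, 511], avoiding out-of-distribution rotations.
angles = rope_angles(seq_len=2048, head_dim=64)

RoPE-specific alternatives such as NTK-aware scaling or SelfExtend modify the frequencies or group positions instead of uniformly rescaling them, which is the distinction the abstract draws between RoPE and APE models.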

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1438548344
Document Type :
Electronic Resource