
A Lazy Approach for Efficient Index Learning

Authors:
Liu, Guanli
Kulik, Lars
Ma, Xingjun
Qi, Jianzhong
Publication Year:
2021
Publisher:
arXiv, 2021.

Abstract

Learned indices using neural networks have been shown to outperform traditional indices such as B-trees in both query time and memory. However, learning the distribution of a large dataset can be expensive, and updating learned indices is difficult, thus hindering their usage in practical applications. In this paper, we address the efficiency and update issues of learned indices through agile model reuse. We pre-train learned indices over a set of synthetic (rather than real) datasets and propose a novel approach to reuse these pre-trained models for a new (real) dataset. The synthetic datasets are created to cover a large range of different distributions. Given a new dataset DT, we select the learned index trained on a synthetic dataset similar to DT and use it to index DT. We show a bound over the indexing error when a pre-trained index is selected. We further show how our techniques can handle data updates and bound the resultant indexing errors. Experimental results on synthetic and real datasets confirm the effectiveness and efficiency of our proposed lazy (model reuse) approach.
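The model-reuse idea in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's actual method: it pre-trains simple linear "learned indices" (models mapping a key to its normalized position in a sorted array) over synthetic datasets drawn from different distributions, then, for a new dataset, picks the pre-trained model whose key distribution is closest. The similarity measure (a quantile-based CDF distance) and all function names here are assumptions for illustration.

```python
import numpy as np

def train_index(keys):
    """Fit a minimal learned index: normalized position ~ a*key + b."""
    keys = np.sort(keys)
    norm_pos = np.arange(len(keys)) / (len(keys) - 1)  # positions in [0, 1]
    a, b = np.polyfit(keys, norm_pos, 1)
    return a, b

def index_error(model, keys):
    """Max |predicted - true| position error of a reused model on `keys`."""
    a, b = model
    keys = np.sort(keys)
    n = len(keys)
    pred = (a * keys + b) * (n - 1)  # rescale normalized prediction
    return float(np.max(np.abs(pred - np.arange(n))))

def distribution_distance(keys_a, keys_b, grid=100):
    """Rough distance between two key distributions via matched quantiles."""
    qs = np.linspace(0.0, 1.0, grid)
    return float(np.max(np.abs(np.quantile(keys_a, qs) - np.quantile(keys_b, qs))))

rng = np.random.default_rng(0)

# Pre-training phase: synthetic datasets covering different distributions.
synthetic = {
    "uniform": rng.uniform(0.0, 1.0, 10_000),
    "normal": rng.normal(0.5, 0.1, 10_000),
    "exponential": rng.exponential(0.2, 10_000),
}
pretrained = {name: train_index(ks) for name, ks in synthetic.items()}

# "Lazy" phase: for a new dataset d_t, reuse the closest pre-trained model
# instead of training a fresh index from scratch.
d_t = rng.normal(0.5, 0.1, 5_000)
best = min(synthetic, key=lambda name: distribution_distance(synthetic[name], d_t))
model = pretrained[best]
print(best, index_error(model, d_t))
```

In this toy setting the normal-distribution model is selected for the normally distributed new dataset, and the resulting indexing error stays bounded because the reused model's training distribution matches the new keys; the paper's contribution is proving such a bound formally and handling updates.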

Details

Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....fc3e3f3fb907543cf72941ebc1b33dc5
Full Text:
https://doi.org/10.48550/arxiv.2102.08081