
A bias--variance evaluation framework for information retrieval systems

Authors :
Zhang, Peng
Gao, Hui
Hu, Zeting
Yang, Meng
Song, Dawei
Wang, Jun
Hou, Yuexian
Hu, Bin
Source :
Information Processing & Management, January 2022, Vol. 59, Issue 1
Publication Year :
2022

Abstract

Keywords:
Information retrieval; Evaluation metrics; Effectiveness--stability tradeoff

Highlights:
* A unified bias--variance metric evaluates the retrieval effectiveness--stability tradeoff.
* A generalized bias--variance metric is defined for both across-topic and per-topic set-ups.
* The factors that influence the bias--variance metric (topic grouping, etc.) are studied.
* Decomposition of the variance can effectively track the source of system instability.

Abstract:
In information retrieval (IR), improving effectiveness often sacrifices the stability of an IR system. To evaluate stability, many risk-sensitive metrics have been proposed. Owing to theoretical limitations, existing works study effectiveness and stability separately and have not explored the effectiveness--stability tradeoff. In this paper, we propose a Bias--Variance Tradeoff Evaluation (BV-Test) framework, based on the bias--variance decomposition of the mean squared error, to measure the overall performance of a system (considering both effectiveness and stability) and the tradeoff between effectiveness and stability. In this framework, we define generalized bias--variance metrics based on the Cranfield-style experimental set-up, where the document collection is fixed (across topics), and the set-up where the document collection is a sample (per-topic). Compared with risk-sensitive evaluation methods, our work not only measures the effectiveness--stability tradeoff of a system, but also effectively tracks the source of system instability. Experiments on the TREC Ad-hoc track (1993--1999) and Web track (2010--2014) show a clear effectiveness--stability tradeoff both across topics and per-topic, and that topic grouping and max--min normalization can effectively reduce the bias--variance tradeoff. Experimental results on the TREC Session track (2010--2012) also show that query reformulation and an increase of user data benefit both effectiveness and stability simultaneously.

Author Affiliation:
(a) College of Intelligence and Computing, Tianjin University, Tianjin, China
(b) School of Computer Science and Technology, Beijing Institute of Technology, Beijing, China
(c) Department of Computer Science, University College London, London, UK
(d) School of Information Science and Engineering, Lanzhou University, Lanzhou, China
* Corresponding authors.

Article History:
Received 25 January 2021; Revised 6 August 2021; Accepted 30 August 2021

Byline:
Peng Zhang [pzhang@tju.edu.cn] (a,*), Hui Gao (a), Zeting Hu (a), Meng Yang (a), Dawei Song [dawei.song2010@gmail.com] (b,*), Jun Wang (c), Yuexian Hou (a), Bin Hu (d)
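Note: the central quantity in the abstract is the bias--variance decomposition of the mean squared error. The sketch below is a minimal illustration of that decomposition applied to a system's per-topic effectiveness scores; it is not the paper's BV-Test definition. The function name, the choice of an ideal target score of 1.0, and the example scores are assumptions made purely for illustration.

```python
import numpy as np

def bias_variance_mse(scores, ideal=1.0):
    """Illustrative bias--variance decomposition of the mean squared error.

    scores : per-topic effectiveness scores of one system (e.g. AP or nDCG),
             treated as samples of its performance across topics.
    ideal  : the target score against which error is measured (here an
             idealised best score of 1.0; the paper's BV-Test framework
             defines its own generalized bias--variance metrics).
    """
    scores = np.asarray(scores, dtype=float)
    mean_score = scores.mean()
    # Squared bias: how far the average performance falls from the target
    # (lower squared bias ~ higher effectiveness).
    bias_sq = (ideal - mean_score) ** 2
    # Variance: how unstable the performance is across topics
    # (lower variance ~ higher stability).
    variance = scores.var()
    # Mean squared error against the target; equals bias^2 + variance.
    mse = np.mean((ideal - scores) ** 2)
    return bias_sq, variance, mse

# Example: hypothetical effectiveness scores on five topics.
b2, var, mse = bias_variance_mse([0.42, 0.55, 0.38, 0.61, 0.47])
assert abs(mse - (b2 + var)) < 1e-12  # the decomposition holds exactly
```

Under this view, comparing two systems by MSE alone captures overall performance, while the bias and variance terms separate the effectiveness and stability contributions, which is the kind of tradeoff the abstract describes.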

Details

Language :
English
ISSN :
0306-4573
Volume :
59
Issue :
1
Database :
Gale General OneFile
Journal :
Information Processing & Management
Publication Type :
Periodical
Accession number :
edsgcl.683337588
Full Text :
https://doi.org/10.1016/j.ipm.2021.102747