
Online Active Regression

Authors :
Chen, Cheng
Li, Yi
Sun, Yiming
Publication Year :
2022

Abstract

Active regression considers a linear regression problem where the learner receives a large number of data points but can only observe a small number of labels. Since online algorithms can handle incremental training data at low computational cost, we consider an online extension of the active regression problem: the learner receives data points one by one and immediately decides whether to collect the corresponding label. The goal is to efficiently maintain the regression of the received data points with a small budget of label queries. We propose novel algorithms for this problem under the $\ell_p$ loss for $p\in[1,2]$. To achieve a $(1+\epsilon)$-approximate solution, our proposed algorithms require only $\tilde{\mathcal{O}}(\epsilon^{-1} d \log(n\kappa))$ label queries, where $n$ is the number of data points and $\kappa$ is the condition number of the data points. Numerical results verify our theoretical findings and show that our methods perform comparably to offline active regression algorithms.

Comment: A preliminary version appeared in the Proceedings of the 39th International Conference on Machine Learning (ICML 2022), PMLR 162, pp. 3320--3335, 2022. v2: optimal dependence on $\epsilon$ in the query complexity.
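The abstract does not spell out the sampling rule, but a common way to realize this kind of online label-query budget is online leverage-score sampling. The sketch below is a minimal illustration under that assumption for the $\ell_2$ case; the function name `online_active_regression`, the `stream` interface, and the $d/\epsilon$ oversampling factor are illustrative choices, not the paper's exact algorithm or notation.

```python
import numpy as np

def online_active_regression(stream, epsilon=0.5, rng=None):
    """Sketch of online active l2 regression via online leverage scores.

    `stream` yields (x, query_label) pairs, where x is a feature vector and
    query_label() returns the (possibly expensive) label y when called.
    The query rule (online leverage score oversampled by ~ d/epsilon) is an
    assumption for illustration, not the paper's exact algorithm.
    """
    rng = np.random.default_rng() if rng is None else rng
    X_sampled, y_sampled, w_sampled = [], [], []
    A = None          # running Gram matrix of all points seen so far
    queries = 0

    for x, query_label in stream:
        x = np.asarray(x, dtype=float)
        d = x.size
        if A is None:
            A = 1e-8 * np.eye(d)          # tiny ridge keeps A invertible
        A += np.outer(x, x)
        # online leverage score of x w.r.t. the points seen so far
        tau = float(x @ np.linalg.solve(A, x))
        # oversampled query probability, clipped to [0, 1]
        p = min(1.0, (d / epsilon) * tau)
        if rng.random() < p:
            y = query_label()             # pay for the label only when sampled
            queries += 1
            X_sampled.append(x)
            y_sampled.append(y)
            w_sampled.append(1.0 / np.sqrt(p))   # importance weight

    # weighted least squares on the queried subsample
    Xs = np.array(X_sampled) * np.array(w_sampled)[:, None]
    ys = np.array(y_sampled) * np.array(w_sampled)
    beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    return beta, queries
```

A caller would pass a generator yielding feature vectors together with a callable that fetches the label on demand, so labels are only paid for when the sampling rule fires; extending the idea to general $\ell_p$ losses would require a different sampling weight (e.g., Lewis-weight style), which is beyond this sketch.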

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2207.05945
Document Type :
Working Paper