
Accelerated Shapley Value Approximation for Data Evaluation

Authors:
Watson, Lauren
Kujawa, Zeno
Andreeva, Rayna
Yang, Hao-Tsung
Elahi, Tariq
Sarkar, Rik
Publication Year:
2023

Abstract

Data valuation has found various applications in machine learning, such as data filtering, efficient learning, and incentives for data sharing. The most popular current approach to data valuation is the Shapley value. Despite its wide applicability, the Shapley value is computationally expensive even to approximate, since it requires repeatedly training models on different subsets of the data. In this paper we show that the Shapley value of data points can be approximated more efficiently by leveraging the structural properties of machine learning problems. We derive convergence guarantees on the accuracy of the approximate Shapley value for different learning settings, including Stochastic Gradient Descent with convex and non-convex loss functions. Our analysis suggests that models trained on small subsets are in fact more important in the context of data valuation. Based on this idea, we describe $\delta$-Shapley -- a strategy that uses only small subsets for the approximation. Experiments show that this approach preserves the approximate values and ranks of data points while achieving speedups of up to 9.9x. For pre-trained networks, the approach is found to be even more efficient, yielding accurate valuations from small subsets.
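The abstract only sketches the small-subset idea behind $\delta$-Shapley. Below is a minimal, hedged Monte Carlo illustration of that idea, not the authors' implementation: it estimates per-point values by averaging marginal contributions over randomly sampled coalitions of bounded size. The utility (validation accuracy of a logistic regression model), the function name small_subset_shapley, and parameters such as max_size and samples_per_size are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

def utility(train_idx, X, y, X_val, y_val):
    """Validation accuracy of a model trained on the given subset (0 if degenerate)."""
    if len(train_idx) == 0 or len(np.unique(y[train_idx])) < 2:
        return 0.0
    model = LogisticRegression(max_iter=200).fit(X[train_idx], y[train_idx])
    return model.score(X_val, y_val)

def small_subset_shapley(X, y, X_val, y_val, max_size=8, samples_per_size=30, rng=None):
    """Monte Carlo value estimate restricted to small coalitions (illustrative sketch).

    For each data point i and each cardinality k < max_size, sample random subsets S
    of size k that exclude i, and average the marginal contribution
    utility(S + {i}) - utility(S). Restricting k to small values mirrors the idea
    that small training subsets carry most of the valuation signal.
    """
    rng = np.random.default_rng(rng)
    n = len(X)
    values = np.zeros(n)
    for i in range(n):
        others = np.array([j for j in range(n) if j != i])
        marginals = []
        for k in range(max_size):
            for _ in range(samples_per_size):
                S = rng.choice(others, size=k, replace=False)
                gain = (utility(np.append(S, i), X, y, X_val, y_val)
                        - utility(S, X, y, X_val, y_val))
                marginals.append(gain)
        values[i] = np.mean(marginals)
    return values

if __name__ == "__main__":
    # Toy dataset: value the training points of a small classification task.
    X, y = make_classification(n_samples=60, n_features=5, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.5, random_state=0)
    vals = small_subset_shapley(X_tr, y_tr, X_val, y_val, max_size=5, samples_per_size=10, rng=0)
    print("Top-5 most valuable training points:", np.argsort(vals)[::-1][:5])
```

The savings come from never training on large subsets: each utility evaluation fits a model on at most max_size points, so the cost per marginal contribution stays small even as the dataset grows.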

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2311.05346
Document Type:
Working Paper