1. An Upper Bound of the Bias of Nadaraya–Watson Kernel Regression under Lipschitz Assumptions
- Authors
- Jan Peters, Riad Akrour, and Samuele Tosatto
- Subjects
Computer Science - Machine Learning (cs.LG), Statistics - Machine Learning (stat.ML), nonparametric regression, Nadaraya-Watson kernel regression, kernel (statistics), estimator, bias, upper and lower bounds, Lipschitz continuity
- Abstract
The Nadaraya–Watson kernel estimator is among the most popular nonparametric regression techniques thanks to its simplicity. Its asymptotic bias was studied by Rosenblatt in 1969 and has been reported in several related works. However, given its asymptotic nature, it does not yield a hard bound. The increasing popularity of predictive tools for automated decision-making heightens the need for hard (non-probabilistic) guarantees. To alleviate this issue, we propose an upper bound on the bias which holds for finite bandwidths, using Lipschitz assumptions and relaxing some of the prerequisites of Rosenblatt's analysis. Our bound has potential applications in fields such as surgical robotics or self-driving cars, where hard guarantees on the prediction error are needed.
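For context, a minimal sketch of the Nadaraya–Watson estimator with a Gaussian kernel is given below. The bandwidth h, the synthetic data, and the function name are illustrative assumptions, not taken from the paper, which concerns bounding this estimator's bias rather than implementing it.

```python
import numpy as np

def nadaraya_watson(x_query, x_train, y_train, h=0.1):
    """Nadaraya-Watson estimate: kernel-weighted average of y_train
    at each query point, using a Gaussian kernel with bandwidth h."""
    # Pairwise squared distances between query and training inputs.
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * h ** 2))        # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)    # weighted mean per query

# Usage: recover a noisy sine curve from 200 samples.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
y = np.sin(2.0 * np.pi * x) + 0.1 * rng.normal(size=200)
print(nadaraya_watson(np.array([0.25, 0.5]), x, y))
```

A finite, nonzero bandwidth h is exactly the regime the paper's bound addresses: the asymptotic bias expression applies only as h shrinks with growing sample size, whereas any deployed estimator runs at a fixed h.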
- Published
- 2021