Choosing the $p$ in $L_p$ loss: rate adaptivity on the symmetric location problem
- Publication Year :
- 2023
Abstract
- Given univariate random variables $Y_1, \ldots, Y_n$ with the $\text{Uniform}(\theta_0 - 1, \theta_0 + 1)$ distribution, the sample midrange $\frac{Y_{(n)}+Y_{(1)}}{2}$ is the MLE for $\theta_0$ and estimates $\theta_0$ with error of order $1/n$, which is much smaller than the $1/\sqrt{n}$ error rate of the usual sample mean estimator. However, the sample midrange performs poorly when the data have, say, the Gaussian $N(\theta_0, 1)$ distribution, with an error rate of $1/\sqrt{\log n}$. In this paper, we propose an estimator of the location $\theta_0$ with a rate of convergence that can, in many settings, adapt to the underlying distribution, which we assume to be symmetric around $\theta_0$ but is otherwise unknown. When the underlying distribution is compactly supported, we show that our estimator attains a rate of convergence of $n^{-\frac{1}{\alpha}}$ up to polylog factors, where the rate parameter $\alpha$ can take any value in $(0, 2]$ and depends on the moments of the underlying distribution. Our estimator is the $\ell^\gamma$-center of the data for a $\gamma \geq 2$ chosen in a data-driven way, by minimizing a criterion motivated by the asymptotic variance. Our approach applies directly to the regression setting where $\theta_0$ is a function of observed features, and it motivates the use of the $\ell^\gamma$ loss function for $\gamma > 2$ in certain settings.
- Comment: 60 pages; 7 figures
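- To make the estimator concrete, below is a minimal sketch (not the authors' code) of the $\ell^\gamma$-center $\arg\min_\theta \sum_i |Y_i - \theta|^\gamma$, with $\gamma$ selected from a small grid by minimizing a plug-in estimate of the standard M-estimation sandwich variance $E|Y-\theta|^{2\gamma-2} / \big((\gamma-1)\,E|Y-\theta|^{\gamma-2}\big)^2$; the paper's exact data-driven criterion may differ, so this is only an illustration of the idea.

```python
# Illustrative sketch only; the function names and the gamma-selection
# criterion below are assumptions, not the authors' implementation.
import numpy as np
from scipy.optimize import minimize_scalar


def lgamma_center(y, gamma):
    """argmin over theta of sum_i |y_i - theta|**gamma (convex for gamma >= 1)."""
    obj = lambda t: np.sum(np.abs(y - t) ** gamma)
    # The minimizer lies within the range of the data, so a bounded search suffices.
    return minimize_scalar(obj, bounds=(y.min(), y.max()), method="bounded").x


def plugin_variance(y, gamma, center):
    """Plug-in sandwich variance of the l^gamma M-estimator (illustrative criterion)."""
    r = np.abs(y - center)
    num = np.mean(r ** (2 * gamma - 2))
    den = (gamma - 1) * np.mean(r ** (gamma - 2))
    return num / den ** 2


rng = np.random.default_rng(0)
theta0 = 0.3
y = rng.uniform(theta0 - 1.0, theta0 + 1.0, size=2000)  # Uniform(theta0-1, theta0+1)

# Choose gamma from a grid by the plug-in variance criterion, then estimate theta0.
gammas = [2, 4, 8, 16, 32]
scores = {g: plugin_variance(y, g, lgamma_center(y, g)) for g in gammas}
best = min(scores, key=scores.get)
print("selected gamma:", best, "l^gamma-center:", lgamma_center(y, best))
print("sample mean:", y.mean(), "sample midrange:", (y.max() + y.min()) / 2)
```

- As $\gamma$ grows, the $\ell^\gamma$-center interpolates from the sample mean ($\gamma = 2$) toward the sample midrange ($\gamma \to \infty$), which is why a data-driven choice of $\gamma$ can adapt between the $1/\sqrt{n}$ and $1/n$ regimes described in the abstract.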
- Subjects :
- Mathematics - Statistics Theory
Statistics - Methodology
62F10
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2303.01992
- Document Type :
- Working Paper