Composite smoothed quantile regression.

Authors :
Yan, Yibo
Wang, Xiaozhou
Zhang, Riquan
Source :
Stat; Dec 2023, Vol. 12, Issue 1, p1-14, 14p
Publication Year :
2023

Abstract

Composite quantile regression (CQR) is an efficient method to estimate the parameters of a linear model with non-Gaussian random noise. The non-smoothness of the CQR loss prevents many efficient algorithms from being used. In this paper, we propose the composite smoothed quantile regression (CSQR) model and investigate the inference problem for large-scale datasets, in which the dimensionality p is allowed to increase with the sample size n while p/n ∈ (0, 1). After applying the convolution smoothing technique to the composite quantile loss, we obtain a convex and twice differentiable CSQR loss function, which can be optimized via the gradient descent algorithm. Theoretically, we establish a non-asymptotic error bound for the CSQR estimators and further provide the Bahadur representation and the Berry–Esseen bound, from which the asymptotic normality of the CSQR estimator follows immediately. To make valid inference, we construct confidence intervals based on the asymptotic distribution. We also explore the asymptotic relative efficiency of the CSQR estimator with respect to the standard CQR estimator. Finally, we provide extensive numerical experiments on both simulated and real data to demonstrate the good performance of our CSQR estimator compared with several baselines. [ABSTRACT FROM AUTHOR]
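To make the abstract's pipeline concrete, the following is a minimal sketch (not the authors' code) of convolution-smoothed composite quantile regression with a Gaussian kernel, for which the smoothed check loss ρ_τ * K_h has the closed-form gradient Φ(u/h) − (1 − τ). The function name, bandwidth heuristic, and step size below are our own illustrative choices; the quantile levels τ_k = k/(K+1) and the plain gradient descent loop follow standard CQR practice.

```python
import numpy as np
from scipy.stats import norm

def csqr_fit(X, y, K=5, h=None, lr=0.5, n_iter=2000):
    """Sketch of composite smoothed quantile regression (CSQR).

    Minimizes (1/(nK)) * sum_{k,i} l_h(y_i - b_k - x_i' beta) by gradient
    descent, where l_h is the check loss rho_tau convolved with a Gaussian
    kernel of bandwidth h, so that l_h'(u) = Phi(u/h) - (1 - tau).
    Returns the shared slope beta and the K level-specific intercepts b.
    """
    n, p = X.shape
    taus = np.arange(1, K + 1) / (K + 1)       # composite quantile levels
    if h is None:
        h = max(0.05, np.sqrt(np.log(n) / n))  # illustrative bandwidth rule
    beta = np.zeros(p)                         # common slope across levels
    b = np.quantile(y, taus)                   # warm-start the K intercepts
    for _ in range(n_iter):
        # residuals u_{ik} = y_i - b_k - x_i' beta, shape (n, K)
        u = y[:, None] - (X @ beta)[:, None] - b[None, :]
        # smoothed score l_h'(u_{ik}) = Phi(u/h) - (1 - tau_k)
        w = norm.cdf(u / h) - (1.0 - taus)
        g_beta = -(X.T @ w).sum(axis=1) / (n * K)  # gradient w.r.t. beta
        g_b = -w.sum(axis=0) / (n * K)             # gradient w.r.t. each b_k
        beta = beta - lr * g_beta
        b = b - lr * g_b
    return beta, b
```

Because the smoothed loss is convex and twice differentiable, plain gradient descent suffices here; the heavy-tailed-noise setting (e.g. Student-t errors) is where the composite estimator is expected to outperform least squares.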

Details

Language :
English
ISSN :
2049-1573
Volume :
12
Issue :
1
Database :
Complementary Index
Journal :
Stat
Publication Type :
Academic Journal
Accession number :
174325278
Full Text :
https://doi.org/10.1002/sta4.542