
Think before you shrink: Alternatives to default shrinkage methods can improve prediction accuracy, calibration and coverage

Authors :
van de Wiel, Mark A.
Leday, Gwenaël G. R.
Hoogland, Jeroen
Heymans, Martijn W.
van Zwet, Erik W.
Zwinderman, Ailko H.
Publication Year :
2023
Publisher :
arXiv, 2023.

Abstract

While shrinkage is essential in high-dimensional settings, its use for low-dimensional regression-based prediction has been debated. It reduces variance, often leading to improved prediction accuracy. However, it also inevitably introduces bias, which may harm two other measures of predictive performance: calibration and coverage of confidence intervals. Much of the criticism stems from the use of standard shrinkage methods, such as lasso and ridge with a single, cross-validated penalty. Our aim is to show that readily available alternatives can strongly improve predictive performance, in terms of accuracy, calibration or coverage. For linear regression, we use small sample splits of a large, fairly typical epidemiological data set to illustrate this. We show that the use of differential ridge penalties for covariate groups may enhance prediction accuracy, while calibration and coverage benefit from additional shrinkage of the penalties. In the logistic setting, we apply an external simulation to demonstrate that local shrinkage improves calibration relative to global shrinkage, while providing better prediction accuracy than other solutions, such as Firth's correction. The benefits of the alternative shrinkage methods are easily accessible via example implementations using mgcv and r-stan, including the estimation of multiple penalties. A synthetic copy of the large data set is shared for reproducibility.

Comment: 35 pages including Supplementary Information
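To illustrate the idea of differential ridge penalties described above, here is a minimal sketch in closed form: each covariate group g gets its own penalty λ_g, so the estimator is β = (XᵀX + Λ)⁻¹Xᵀy with Λ a diagonal matrix holding λ_{g(j)} for column j. This is an illustrative simplification, not the paper's implementation (which uses mgcv and r-stan, including estimation of the penalties themselves); the function and variable names are invented for this example.

```python
import numpy as np

def differential_ridge(X, y, group_ids, lambdas):
    """Ridge regression with a separate penalty per covariate group.

    group_ids[j] is the group label of column j; lambdas[g] is the
    penalty for group g. Closed form: beta = (X'X + Lambda)^{-1} X'y,
    where Lambda is diagonal with lambda_{g(j)} on the diagonal.
    Hypothetical helper for illustration only; penalties are fixed here,
    not estimated as in the paper.
    """
    penalties = np.array([lambdas[g] for g in group_ids], dtype=float)
    A = X.T @ X + np.diag(penalties)
    return np.linalg.solve(A, X.T @ y)

# Toy example: two covariate groups, penalized differently.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 4))
beta_true = np.array([2.0, 2.0, 0.0, 0.0])
y = X @ beta_true + rng.standard_normal(50)

# Group 0: first two columns (signal), lightly penalized.
# Group 1: last two columns (noise), heavily penalized.
beta = differential_ridge(X, y, group_ids=[0, 0, 1, 1],
                          lambdas={0: 0.1, 1: 100.0})
```

With a single global penalty, the signal coefficients would be shrunk as much as the noise ones; group-specific penalties let the noise group be shrunk hard while leaving the signal group nearly unbiased.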

Details

Database :
OpenAIRE
Accession number :
edsair.doi.dedup.....8c14bdb7a34cb437407682f49f74b09e
Full Text :
https://doi.org/10.48550/arxiv.2301.09890