Cross-validation pitfalls when selecting and assessing regression and classification models.
- Source: Journal of cheminformatics [J Cheminform] 2014 Mar 29; Vol. 6 (1), p. 10. Date of Electronic Publication: 2014 Mar 29.
- Publication Year: 2014
Abstract
- Background: We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications, including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing, which enables routine use of previously infeasible approaches.
- Methods: We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. For variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using repeated grid-search in the general case.
- Results: We show results of our algorithms on seven QSAR datasets. The variation in prediction performance that results from choosing different splits of the dataset in V-fold cross-validation needs to be taken into account when selecting and assessing classification and regression models.
- Conclusions: We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing the prediction error.
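
The full algorithms are specified in the paper itself; purely as an illustrative sketch, the fragment below shows how repeated grid-search V-fold cross-validation (for tuning) and repeated nested cross-validation (for assessment) can be set up with scikit-learn. The estimator (SVR), the synthetic dataset, the parameter grid, and the repeat/fold counts are placeholder assumptions, not values from the paper.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score
from sklearn.svm import SVR

# Placeholder data and grid -- not from the paper.
X, y = make_regression(n_samples=200, n_features=20, noise=0.5, random_state=0)
param_grid = {"C": [0.1, 1, 10], "epsilon": [0.01, 0.1, 1]}
n_repeats, V = 10, 5

# Repeated grid-search V-fold cross-validation for parameter tuning:
# rerun the V-fold grid search over different random splits and
# inspect how stable the winning parameters are.
chosen = []
for rep in range(n_repeats):
    cv = KFold(n_splits=V, shuffle=True, random_state=rep)
    search = GridSearchCV(SVR(), param_grid, cv=cv,
                          scoring="neg_mean_squared_error")
    search.fit(X, y)
    chosen.append(tuple(sorted(search.best_params_.items())))
print("Parameter choices across repeats:", set(chosen))

# Repeated nested cross-validation for model assessment:
# an outer V-fold loop wrapped around the grid search, itself
# repeated over different outer splits to expose split-to-split variance.
scores = []
for rep in range(n_repeats):
    outer = KFold(n_splits=V, shuffle=True, random_state=100 + rep)
    inner = KFold(n_splits=V, shuffle=True, random_state=200 + rep)
    nested = GridSearchCV(SVR(), param_grid, cv=inner,
                          scoring="neg_mean_squared_error")
    scores.extend(cross_val_score(nested, X, y, cv=outer,
                                  scoring="neg_mean_squared_error"))
print("Nested CV MSE: %.2f +/- %.2f" % (-np.mean(scores), np.std(scores)))
```

The spread of the outer-fold scores across repetitions reflects the split-to-split variation that the authors argue should be reported alongside the point estimate of prediction error.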
Details
- Language: English
- ISSN: 1758-2946
- Volume: 6
- Issue: 1
- Database: MEDLINE
- Journal: Journal of cheminformatics
- Publication Type: Academic Journal
- Accession Number: 24678909
- Full Text: https://doi.org/10.1186/1758-2946-6-10