
Learning to Tune XGBoost with XGBoost

Authors :
Sommer, Johanna
Sarigiannis, Dimitrios
Parnell, Thomas
Publication Year :
2019

Abstract

In this short paper we investigate whether meta-learning techniques can be used to more effectively tune the hyperparameters of machine learning models using successive halving (SH). We propose a novel variant of the SH algorithm (MeSH) that uses meta-regressors to determine which candidate configurations should be eliminated at each round. We apply MeSH to the problem of tuning the hyperparameters of a gradient-boosted decision tree model. By training and tuning our meta-regressors using existing tuning jobs from 95 datasets, we demonstrate that MeSH can often find a superior solution to both SH and random search.

Comment: Accepted for presentation at The 3rd Workshop on Meta-Learning (Meta-Learn 2019), Vancouver, Canada
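The abstract only sketches the algorithm, so the following Python sketch is a rough illustration of the core idea: in each successive-halving round, candidates are ranked not by their current validation score but by a meta-regressor's prediction of their final score. The helpers (evaluate, make_meta_features), the featurization, and the halving rate eta are all hypothetical choices for illustration, not the paper's actual implementation.

    import numpy as np

    def make_meta_features(config, score, budget):
        # Hypothetical featurization: raw hyperparameter values plus the
        # partial validation score and the budget spent so far.
        return list(config.values()) + [score, budget]

    def mesh_successive_halving(configs, evaluate, meta_regressor,
                                rounds=4, budget=1, eta=2):
        # evaluate(config, budget) -> validation score under that budget.
        # meta_regressor.predict(X) -> predicted *final* score per candidate.
        survivors = list(configs)
        for _ in range(rounds):
            scores = [evaluate(c, budget) for c in survivors]
            # Plain SH would rank candidates on `scores`; MeSH instead ranks
            # on the meta-regressor's forecast of each candidate's final score.
            X = np.array([make_meta_features(c, s, budget)
                          for c, s in zip(survivors, scores)])
            predicted_final = meta_regressor.predict(X)
            keep = max(1, len(survivors) // eta)
            order = np.argsort(predicted_final)[::-1]  # higher score is better
            survivors = [survivors[i] for i in order[:keep]]
            budget *= eta
            if len(survivors) == 1:
                break
        return survivors[0]

Per the title and abstract, the meta-regressors are themselves gradient-boosted models trained on past tuning jobs from 95 datasets; in this sketch, meta_regressor could plausibly be an xgboost.XGBRegressor fit on (meta-features, final score) pairs from those jobs.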

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.1909.07218
Document Type :
Working Paper