
Light Gradient Boosting Machine as a Regression Method for Quantitative Structure-Activity Relationships

Authors:
Sheridan, Robert P.
Liaw, Andy
Tudor, Matthew
Publication Year:
2021

Abstract

In the pharmaceutical industry, where it is common to generate many QSAR models with large numbers of molecules and descriptors, the best QSAR methods are those that generate the most accurate predictions while also being insensitive to hyperparameters and computationally efficient. Here we compare Light Gradient Boosting Machine (LightGBM) to random forest, single-task deep neural nets, and Extreme Gradient Boosting (XGBoost) on 30 in-house data sets. While any boosting algorithm has many adjustable hyperparameters, we can define a set of standard hyperparameters at which LightGBM makes predictions about as accurate as single-task deep neural nets, while being ~1000-fold faster than random forest and ~4-fold faster than XGBoost in terms of total computational time for the largest models. Another very useful feature of LightGBM is that it includes a native method for estimating prediction intervals.

Comment: 32 pages, 4 figures

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2105.08626
Document Type:
Working Paper