
AdaBoost and Forward Stagewise Regression are First-Order Convex Optimization Methods

Authors:
Freund, Robert M.
Grigas, Paul
Mazumder, Rahul
Publication Year:
2013

Abstract

Boosting methods are popular and effective supervised learning techniques that combine weak learners into a single accurate model with good statistical performance. In this paper, we analyze two well-known boosting methods, AdaBoost and Incremental Forward Stagewise Regression (FS$_\varepsilon$), by establishing their precise connections to the Mirror Descent algorithm, a first-order method in convex optimization. As a consequence of these connections, we obtain novel computational guarantees for these boosting methods. In particular, we characterize convergence bounds for AdaBoost, with respect to both the margin and the log-exponential loss function, for any step-size sequence. Furthermore, this paper presents, for the first time, precise computational complexity results for FS$_\varepsilon$.
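For concreteness, the sketch below illustrates the two boosting methods the abstract refers to: AdaBoost (here with simple threshold stumps as weak learners, an illustrative assumption, since the paper does not fix a particular weak learner) and Incremental Forward Stagewise Regression FS$_\varepsilon$. The update rules are the standard textbook versions; the function names and parameter defaults (eps, n_steps, n_rounds) are hypothetical choices for illustration, not taken from the paper.

```python
import numpy as np

def fs_eps(X, y, eps=0.01, n_steps=500):
    """Incremental Forward Stagewise Regression (FS_eps): at each step,
    nudge by eps the coefficient of the predictor most correlated with
    the current residual."""
    beta = np.zeros(X.shape[1])
    r = y.astype(float)                     # current residual
    for _ in range(n_steps):
        corr = X.T @ r                      # correlations with residual
        j = np.argmax(np.abs(corr))         # most correlated predictor
        step = eps * np.sign(corr[j])
        beta[j] += step
        r -= step * X[:, j]
    return beta

def adaboost_stumps(X, y, n_rounds=50):
    """AdaBoost with one-dimensional threshold stumps as weak learners;
    labels y must take values in {-1, +1}."""
    n, p = X.shape
    w = np.full(n, 1.0 / n)                 # distribution over examples
    model = []
    for _ in range(n_rounds):
        # Weak learner: the stump minimizing weighted 0-1 error.
        best = (np.inf, None)
        for j in range(p):
            for thr in np.unique(X[:, j]):
                for s in (1.0, -1.0):
                    pred = s * np.where(X[:, j] <= thr, 1.0, -1.0)
                    err = w @ (pred != y)
                    if err < best[0]:
                        best = (err, (j, thr, s, pred))
        err, (j, thr, s, pred) = best
        # Classical step size; the paper's analysis covers general
        # step-size sequences, of which this is one instance.
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)      # exponential reweighting
        w /= w.sum()                        # renormalize over the simplex
        model.append((alpha, j, thr, s))
    return model
```

Roughly speaking, it is the multiplicative reweighting and renormalization of the example weights w that the paper identifies with a Mirror Descent step over the probability simplex.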

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.1307.1192
Document Type:
Working Paper