Online Gradient Boosting

Authors :
Beygelzimer, Alina
Hazan, Elad
Kale, Satyen
Luo, Haipeng
Publication Year :
2015

Abstract

We extend the theory of boosting for regression problems to the online learning setting. Generalizing from the batch setting, we model a weak learning algorithm as an online learning algorithm with linear loss functions that competes with a base class of regression functions, and a strong learning algorithm as an online learning algorithm with convex loss functions that competes with a larger class of regression functions. Our main result is an online gradient boosting algorithm that converts a weak online learning algorithm into a strong one, where the larger class of functions is the linear span of the base class. We also give a simpler boosting algorithm that converts a weak online learning algorithm into a strong one where the larger class is the convex hull of the base class, and we prove its optimality.
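
The reduction in the abstract can be sketched concretely: maintain several copies of the weak online learner, form the strong prediction as a scaled sum of their outputs, and feed each copy the linear loss induced by the gradient of the convex loss at the running partial prediction. The Python sketch below is an illustration under assumed simplifications (squared loss, a toy linear weak learner, a fixed step size eta, and the class/parameter names are hypothetical); it is not the paper's exact algorithm, whose step sizes and shrinkage terms differ.

```python
# Illustrative sketch only: squared loss, a toy linear weak learner, and a
# fixed step size `eta` are assumptions, not the paper's exact algorithm.
import numpy as np


class OnlineWeakLearner:
    """Toy weak online learner: a linear model updated on linear (gradient) losses."""

    def __init__(self, dim, lr=0.1):
        self.w = np.zeros(dim)
        self.lr = lr

    def predict(self, x):
        return float(self.w @ x)

    def update(self, x, grad):
        # The weak learner faces the linear loss grad * f(x); a gradient
        # step on that loss moves w against grad * x.
        self.w -= self.lr * grad * x


class OnlineGradientBooster:
    """Strong online learner built from N weak learners; its predictions lie
    in a scaled span of the base class."""

    def __init__(self, dim, n_learners=10, eta=0.5):
        self.learners = [OnlineWeakLearner(dim) for _ in range(n_learners)]
        self.eta = eta

    def _partials(self, x):
        # Partial predictions y^0 = 0, y^i = y^{i-1} + eta * f_i(x).
        partials = [0.0]
        for f in self.learners:
            partials.append(partials[-1] + self.eta * f.predict(x))
        return partials

    def predict(self, x):
        return self._partials(x)[-1]

    def update(self, x, y):
        # Weak learner i receives the linear loss given by the gradient of
        # the convex loss (here 0.5 * (y_hat - y)^2) at partial prediction y^{i-1}.
        partials = self._partials(x)
        for i, f in enumerate(self.learners):
            grad = partials[i] - y
            f.update(x, grad)


# Online protocol: at each round, predict, observe the label, then update.
rng = np.random.default_rng(0)
booster = OnlineGradientBooster(dim=3)
w_true = np.array([1.0, -2.0, 0.5])
total = 0.0
for t in range(1000):
    x = rng.normal(size=3)
    y = float(w_true @ x)
    total += 0.5 * (booster.predict(x) - y) ** 2
    booster.update(x, y)
print(f"average squared loss over 1000 rounds: {total / 1000:.4f}")
```

Feeding gradients as linear losses is the key design point: a learner that is competitive only against linear losses can still drive progress on a general convex loss, mirroring how batch gradient boosting fits residuals stage by stage.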

Subjects

Computer Science - Learning

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.1506.04820
Document Type :
Working Paper