
Shrinkage Degree in L2-Rescale Boosting for Regression.

Authors :
Xu, Lin
Lin, Shaobo
Wang, Yao
Xu, Zongben
Source :
IEEE Transactions on Neural Networks & Learning Systems; Aug 2017, Vol. 28, Issue 8, p1851-1864, 14p
Publication Year :
2017

Abstract

L2-rescale boosting (L2-RBoosting) is a variant of L2-Boosting that can essentially improve the generalization performance of L2-Boosting. The key feature of L2-RBoosting lies in introducing a shrinkage degree to rescale the ensemble estimate in each iteration. Thus, the shrinkage degree determines the performance of L2-RBoosting. The aim of this paper is to develop a concrete analysis of how to determine the shrinkage degree in L2-RBoosting. We propose two feasible ways to select the shrinkage degree: the first is to parameterize the shrinkage degree, and the other is to develop a data-driven approach. After rigorously analyzing the importance of the shrinkage degree in L2-RBoosting, we compare the pros and cons of the proposed methods. We find that although both approaches can reach the same learning rates, the structure of the final estimator of the parameterized approach is better, which sometimes yields better generalization capability when the number of samples is finite. With this, we recommend parameterizing the shrinkage degree of L2-RBoosting. We also present an adaptive parameter-selection strategy for the shrinkage degree and verify its feasibility through both theoretical analysis and numerical verification. The obtained results enhance the understanding of L2-RBoosting and give guidance on how to use it for regression tasks. [ABSTRACT FROM AUTHOR]
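
For intuition, the sketch below shows what a rescaled L2-boosting iteration with a parameterized shrinkage degree could look like. The update rule f_k = (1 - alpha_k) f_{k-1} + beta_k g_k and the schedule alpha_k = 2/(k + 2) are illustrative assumptions drawn from the general rescale-boosting idea; they are not taken from the paper itself, whose exact formulation and parameter-selection strategy may differ.

```python
import numpy as np

def l2_rboosting(X, y, weak_learners, n_iter=50):
    """Illustrative sketch of L2-rescale boosting for regression.

    weak_learners: list of callables g(X) -> prediction vector (candidate base functions).
    The shrinkage schedule alpha_k = 2 / (k + 2) is an assumption for this sketch.
    """
    n = len(y)
    f = np.zeros(n)  # current ensemble estimate
    for k in range(1, n_iter + 1):
        alpha = 2.0 / (k + 2.0)           # assumed parameterized shrinkage degree
        residual = y - (1.0 - alpha) * f  # residual of the rescaled ensemble
        # Greedy step: pick the base learner and least-squares step size
        # that best fit the residual.
        best_loss, best_pred, best_beta = np.inf, None, 0.0
        for g in weak_learners:
            pred = g(X)
            denom = np.dot(pred, pred)
            if denom == 0.0:
                continue
            beta = np.dot(residual, pred) / denom
            loss = np.sum((residual - beta * pred) ** 2)
            if loss < best_loss:
                best_loss, best_pred, best_beta = loss, pred, beta
        if best_pred is None:
            break
        # Rescale the previous estimate and add the selected weak learner.
        f = (1.0 - alpha) * f + best_beta * best_pred
    return f
```

As a usage note under the same assumptions, `weak_learners` could be, for example, coordinate projections `lambda X, j=j: X[:, j]` for a linear dictionary; the data-driven alternative discussed in the abstract would instead choose alpha at each iteration from the sample rather than from a fixed schedule.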

Details

Language :
English
ISSN :
2162-237X
Volume :
28
Issue :
8
Database :
Complementary Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
124252317
Full Text :
https://doi.org/10.1109/TNNLS.2016.2560224