
Boosted Mixture of Experts: An Ensemble Learning Scheme.

Authors :
Avnimelech, Ran
Intrator, Nathan
Source :
Neural Computation. 02/15/99, Vol. 11 Issue 2, p483-497. 15p. 1 Black and White Photograph, 1 Graph.
Publication Year :
1999

Abstract

We present a new supervised learning procedure for ensemble machines, in which outputs of predictors, trained on different distributions, are combined by a dynamic classifier combination model. This procedure may be viewed as either a version of mixture of experts (Jacobs, Jordan, Nowlan, & Hinton, 1991), applied to classification, or a variant of the boosting algorithm (Schapire, 1990). As a variant of the mixture of experts, it can be made appropriate for general classification and regression problems by initializing the partition of the data set to different experts in a boostlike manner. If viewed as a variant of the boosting algorithm, its main gain is the use of a dynamic combination model for the outputs of the networks. Results are demonstrated on a synthetic example and a digit recognition task from the NIST database and compared with classical ensemble approaches. [ABSTRACT FROM AUTHOR]
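The abstract describes the method only at a high level. As a rough illustration (not the authors' algorithm), the following minimal Python sketch shows the two ingredients the abstract names: experts trained on boosting-style reweighted distributions, and a trainable gate that combines their outputs per input. The decision-tree and logistic-regression choices, the weight-doubling rule, and all function names are assumptions made for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

def train_boosted_moe(X, y, n_experts=3, seed=0):
    # Sketch: initialize the partition of the data to experts in a
    # boost-like manner by resampling with per-example weights.
    rng = np.random.default_rng(seed)
    n = len(X)
    weights = np.full(n, 1.0 / n)  # start from the uniform distribution
    experts = []
    for _ in range(n_experts):
        idx = rng.choice(n, size=n, p=weights)
        expert = DecisionTreeClassifier(max_depth=3, random_state=seed)
        expert.fit(X[idx], y[idx])
        experts.append(expert)
        # Upweight misclassified examples so the next expert focuses on
        # them (the doubling factor is an arbitrary illustrative choice).
        miss = expert.predict(X) != y
        weights = weights * np.where(miss, 2.0, 1.0)
        weights = weights / weights.sum()
    # Gate: learn, per input, which expert to trust. Targets are the
    # index of an expert that classifies each point correctly (falling
    # back to expert 0 when none does). Note this assumes the targets
    # span at least two expert indices.
    correct = np.array([e.predict(X) == y for e in experts])
    gate_targets = np.where(correct.any(axis=0), correct.argmax(axis=0), 0)
    gate = LogisticRegression(max_iter=1000).fit(X, gate_targets)
    return experts, gate

def predict_boosted_moe(experts, gate, X):
    # Dynamic combination, reduced to hard gating for brevity: route
    # each input to the single expert the gate selects for it.
    chosen = gate.predict(X)
    all_preds = np.vstack([e.predict(X) for e in experts])
    return all_preds[chosen, np.arange(len(X))]

For brevity the sketch routes each input to one expert (hard gating); the dynamic combination model described in the abstract instead weights the experts' outputs per input, which could be approximated here by mixing expert predictions with the gate's predicted probabilities.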

Details

Language :
English
ISSN :
0899-7667
Volume :
11
Issue :
2
Database :
Academic Search Index
Journal :
Neural Computation
Publication Type :
Academic Journal
Accession Number :
1555012
Full Text :
https://doi.org/10.1162/089976699300016737