
The performance bounds of learning machines based on exponentially strongly mixing sequences

Authors :
Zou, Bin
Li, Luoqing
Source :
Computers & Mathematics with Applications. Apr2007, Vol. 53 Issue 7, p1050-1058. 9p.
Publication Year :
2007

Abstract

Generalization performance is a central concern of theoretical research in machine learning. Vapnik, and later Cucker and Smale, showed that for learning machines trained on an i.i.d. sequence, the empirical risks converge uniformly to their expected risks as the number of samples approaches infinity. To study the generalization performance of learning machines with dependent input sequences, this paper extends these results to the case where the i.i.d. sequence is replaced by an exponentially strongly mixing sequence. Using Bernstein's inequality for exponentially strongly mixing sequences, we obtain a bound on the rate of uniform convergence for learning machines, and we establish a bound on the rate of relative uniform convergence for learning machines based on an exponentially strongly mixing sequence. Finally, we compare these bounds with previous results. [Copyright © Elsevier]
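
The uniform-convergence property summarized in the abstract can be sketched as follows (a standard textbook formulation with assumed notation, not taken verbatim from the paper):

```latex
% Sketch of the uniform-convergence statement (illustrative; the
% hypothesis class \mathcal{H}, loss \ell, and samples z_i are
% assumed notation, not the paper's own).
% Empirical risk over n samples and expected risk:
\[
  \mathcal{E}_n(f) = \frac{1}{n}\sum_{i=1}^{n} \ell(f, z_i),
  \qquad
  \mathcal{E}(f) = \mathbb{E}\,\ell(f, z).
\]
% Uniform convergence asserts that, with high probability,
\[
  \sup_{f \in \mathcal{H}} \bigl|\, \mathcal{E}(f) - \mathcal{E}_n(f) \,\bigr|
  \longrightarrow 0 \quad \text{as } n \to \infty.
\]
% "Exponentially strongly mixing" weakens the i.i.d. assumption:
% the strong (alpha-)mixing coefficients of the sample sequence
% decay exponentially,
\[
  \alpha(k) \le \bar{\alpha}\, \exp\!\left(-c\, k^{\gamma}\right),
  \qquad c > 0,\ \gamma > 0,
\]
% so dependence between samples fades as their separation k grows.
```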

Details

Language :
English
ISSN :
0898-1221
Volume :
53
Issue :
7
Database :
Academic Search Index
Journal :
Computers & Mathematics with Applications
Publication Type :
Academic Journal
Accession Number :
25118960
Full Text :
https://doi.org/10.1016/j.camwa.2006.07.015