
On the difficulty of approximately maximizing agreements

Authors :
Ben-David, Shai
Eiron, Nadav
Long, Philip M.
Source :
Journal of Computer & System Sciences. May 2003, Vol. 66 Issue 3, p496. 19p.
Publication Year :
2003

Abstract

We address the computational complexity of learning in the agnostic framework. For a variety of common concept classes we prove that, unless P=NP, there is no polynomial-time approximation scheme for finding a member of the class that approximately maximizes the agreement with a given training sample. In particular, our results apply to the classes of monomials, axis-aligned hyper-rectangles, closed balls, and monotone monomials. For each of these classes, we prove the NP-hardness of approximating maximal agreement to within some fixed constant (independent of the sample size and of the dimensionality of the sample space). For the class of half-spaces, we prove that, for any ε>0, it is NP-hard to approximately maximize agreements to within a factor of (418/415−ε), improving on the best previously known constant for this problem, and using a simpler proof. An interesting feature of our proofs is that, for each of the classes we discuss, we find patterns of training examples that, while being hard for approximating agreement within that concept class, allow efficient agreement maximization within other concept classes. These results bring up a new aspect of the model selection problem: they imply that the choice of hypothesis class for agnostic learning from among those considered in this paper can drastically affect the computational complexity of the learning process. [Copyright Elsevier]
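To make the problem concrete: agnostic agreement maximization asks for the hypothesis in a fixed class that correctly labels as many training examples as possible. The sketch below (an illustration, not the paper's construction) brute-forces this for monotone monomials over boolean vectors; its exponential search over all 2^n conjunctions is consistent with the abstract's point that, unless P=NP, no polynomial-time approximation scheme exists for this class.

```python
from itertools import combinations

def agreement(hyp_vars, sample):
    """Count examples that the monotone monomial (the AND of the
    variables in hyp_vars) labels correctly."""
    return sum(all(x[v] for v in hyp_vars) == y for x, y in sample)

def best_monotone_monomial(sample, n):
    """Exhaustively search all 2^n monotone monomials over n boolean
    variables for one maximizing agreement with the labeled sample.
    Exponential in n: the hardness results above suggest this kind of
    blow-up is unavoidable for (near-)maximal agreement."""
    best, best_agree = (), -1
    for k in range(n + 1):
        for hyp in combinations(range(n), k):
            a = agreement(hyp, sample)
            if a > best_agree:
                best, best_agree = hyp, a
    return best, best_agree

# Tiny example: the monomial x0 AND x1 agrees with all three examples.
sample = [((1, 1, 0), True), ((1, 0, 0), False), ((0, 1, 1), False)]
print(best_monotone_monomial(sample, 3))
```

In the agnostic setting the sample need not be consistent with any monomial, so the best achievable agreement may fall short of the sample size; the hardness results concern approximating that maximum.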

Subjects

*MACHINE learning
*MACHINE theory

Details

Language :
English
ISSN :
0022-0000
Volume :
66
Issue :
3
Database :
Academic Search Index
Journal :
Journal of Computer & System Sciences
Publication Type :
Academic Journal
Accession number :
9853558
Full Text :
https://doi.org/10.1016/S0022-0000(03)00038-2