
Sensitive Error Correcting Output Codes

Authors: John Langford, Alina Beygelzimer
Source: Learning Theory ISBN: 9783540265566, COLT
Publication Year: 2005
Publisher: Springer Berlin Heidelberg, 2005.

Abstract

We present a reduction from cost-sensitive classification to binary classification based on (a modification of) error correcting output codes. The reduction satisfies the property that $\epsilon$ regret for binary classification implies $\ell_2$-regret of at most $2\epsilon$ for cost estimation. This has several implications:

1. Any regret-minimizing online algorithm for 0/1 loss is (via the reduction) a regret-minimizing online cost-sensitive algorithm. In particular, this means that online learning can be made to work for arbitrary (i.e., totally unstructured) loss functions.
2. The output of the reduction can be thresholded so that $\epsilon$ regret for binary classification implies at most $4\sqrt{\epsilon Z}$ regret for cost-sensitive classification, where $Z$ is the expected sum of costs.
3. For multiclass problems, $\epsilon$ binary regret translates into $\ell_2$-regret of at most $4\epsilon$ in the estimation of class probabilities. For classification, this implies at most $4\sqrt{\epsilon}$ multiclass regret.
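The abstract states the reduction only in terms of its regret guarantees. As a rough illustration of the output-codes idea it builds on (subset-average costs turned into thresholded binary problems, whose predictions are decoded back into per-class cost estimates), here is a minimal Python sketch. The random code matrix, the evenly spaced thresholds, the logistic-regression base learner, the least-squares decoding, and all names (`subset_cost`, `predict_class`, etc.) are illustrative assumptions, not the paper's exact SECOC construction or its analysis.

```python
# Illustrative sketch of an output-codes-style reduction from cost-sensitive
# multiclass classification to binary classification. NOT the exact SECOC
# construction from the paper; all modeling choices here are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic cost-sensitive data: features x in R^d, cost vector c(x) in (0,1)^k.
n, d, k = 1000, 5, 4
X = rng.normal(size=(n, d))
W = rng.normal(size=(d, k))
costs = 1.0 / (1.0 + np.exp(-(X @ W)))        # hypothetical ground-truth costs

# Random binary code matrix: each row selects a subset of classes.
m = 12                                         # number of subsets
M = rng.integers(0, 2, size=(m, k)).astype(float)
M[M.sum(axis=1) == 0, 0] = 1.0                 # avoid empty subsets
M_norm = M / M.sum(axis=1, keepdims=True)      # row-normalized for decoding

# Evenly spaced thresholds: averaging indicators 1[c > t] over t approximates c.
T = 8
thresholds = (np.arange(T) + 0.5) / T

subset_cost = costs @ M_norm.T                 # (n, m) subset-average costs

# One binary problem per (subset, threshold) pair.
learners = {}
for j in range(m):
    for t in thresholds:
        y_bin = (subset_cost[:, j] > t).astype(int)
        if y_bin.min() == y_bin.max():
            learners[(j, t)] = float(y_bin[0])  # degenerate constant label
        else:
            learners[(j, t)] = LogisticRegression(max_iter=200).fit(X, y_bin)

def predict_class(x):
    """Estimate per-class costs from the binary predictions; return argmin class."""
    x = x.reshape(1, -1)
    s_hat = np.empty(m)
    for j in range(m):
        votes = []
        for t in thresholds:
            clf = learners[(j, t)]
            p = clf if isinstance(clf, float) else clf.predict_proba(x)[0, 1]
            votes.append(p)
        s_hat[j] = np.mean(votes)               # estimated subset-average cost
    # Least-squares decode: class costs most consistent with the subset averages.
    c_hat, *_ = np.linalg.lstsq(M_norm, s_hat, rcond=None)
    return int(np.argmin(c_hat))

preds = np.array([predict_class(X[i]) for i in range(200)])
print("avg cost of reduction's predictions:", costs[np.arange(200), preds].mean())
print("avg cost of the best class (oracle):", costs[:200].min(axis=1).mean())
```

The averaging-over-thresholds step reflects the identity $\mathbb{E}_{t \sim U[0,1]}[\mathbf{1}[c > t]] = c$ for $c \in [0,1]$, which is what allows binary predictions to be reassembled into cost estimates; the paper's actual construction and its regret analysis are more refined than this sketch.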

Details

ISBN: 978-3-540-26556-6
Database: OpenAIRE
Journal: Learning Theory ISBN: 9783540265566, COLT
Accession number: edsair.doi...........60d92955f4dcbdb61361934a0cc04836