Cost-Sensitive Active Visual Category Learning
- Source :
- International Journal of Computer Vision, 91:24-44
- Publication Year :
- 2010
- Publisher :
- Springer Science and Business Media LLC, 2010.
Abstract
- We present an active learning framework that predicts the tradeoff between the effort and information gain associated with a candidate image annotation, thereby ranking unlabeled and partially labeled images according to their expected "net worth" to an object recognition system. We develop a multi-label multiple-instance approach that accommodates realistic images containing multiple objects and allows the category-learner to strategically choose what annotations it receives from a mixture of strong and weak labels. Since the annotation cost can vary depending on an image's complexity, we show how to improve the active selection by directly predicting the time required to segment an unlabeled image. Our approach accounts for the fact that the optimal use of manual effort may call for a combination of labels at multiple levels of granularity, as well as accurate prediction of manual effort. As a result, it is possible to learn more accurate category models with a lower total expenditure of annotation effort. Given a small initial pool of labeled data, the proposed method actively improves the category models with minimal manual intervention.
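- The selection principle described in the abstract can be illustrated with a minimal sketch. The snippet below is an assumption-laden stand-in rather than the paper's actual method: it uses prediction entropy as a simple proxy for expected information gain, a hypothetical `cost_model` regressor for predicted annotation time, and a single `tradeoff` weight to place gain and cost on a common scale.

```python
import numpy as np

def entropy(p):
    """Binary prediction entropy, used here as a crude proxy for information gain."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def rank_candidates(classifier, cost_model, X_unlabeled, tradeoff=1.0):
    """
    Rank unlabeled examples by expected 'net worth': estimated information
    gain minus predicted annotation cost. `classifier` is assumed to expose
    an sklearn-style predict_proba; `cost_model` is a hypothetical regressor
    trained to predict manual annotation time from image features.
    """
    p_pos = classifier.predict_proba(X_unlabeled)[:, 1]
    gain = entropy(p_pos)                   # uncertainty as a proxy for gain
    cost = cost_model.predict(X_unlabeled)  # predicted manual effort (e.g., seconds)
    net_worth = gain - tradeoff * cost
    return np.argsort(-net_worth)           # most valuable candidates first
```

- A faithful implementation of the framework would go further: the paper considers candidate annotations at multiple levels of granularity (weak image-level tags versus full segmentations), so each (image, annotation type) pair would be scored separately, with gain and predicted effort expressed in comparable units.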
- Subjects :
- Active learning (machine learning)
  Computer science
  Cognitive neuroscience of visual object recognition
  Pattern recognition
  Semi-supervised learning
  Machine learning
  Annotation
  Ranking
  Artificial intelligence
  Concept learning
  Pattern recognition (psychology)
  Computer Vision and Pattern Recognition
  Software
Details
- ISSN :
- 1573-1405 and 0920-5691
- Volume :
- 91
- Database :
- OpenAIRE
- Journal :
- International Journal of Computer Vision
- Accession number :
- edsair.doi...........2c5370700632b5dc13847b2327f846ae
- Full Text :
- https://doi.org/10.1007/s11263-010-0372-4