
Structured learning with constrained conditional models.

Authors :
Chang, Ming-Wei
Ratinov, Lev
Roth, Dan
Source :
Machine Learning; Sep 2012, Vol. 88, Issue 3, p399-431, 33p
Publication Year :
2012

Abstract

Making complex decisions in real world problems often involves assigning values to sets of interdependent variables where an expressive dependency structure among these can influence, or even dictate, what assignments are possible. Commonly used models typically ignore expressive dependencies since the traditional way of incorporating non-local dependencies is inefficient and hence leads to expensive training and inference. The contribution of this paper is two-fold. First, this paper presents Constrained Conditional Models (CCMs), a framework that augments linear models with declarative constraints as a way to support decisions in an expressive output space while maintaining modularity and tractability of training. The paper develops, analyzes and compares novel algorithms for CCMs based on Hidden Markov Models and Structured Perceptron. The proposed CCM framework is also compared to task-tailored models, such as semi-CRFs. Second, we propose CoDL, a constraint-driven learning algorithm, which makes use of constraints to guide semi-supervised learning. We provide theoretical justification for CoDL along with empirical results which show the advantage of using declarative constraints in the context of semi-supervised training of probabilistic models. [ABSTRACT FROM AUTHOR]
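The abstract describes CCMs as linear models augmented with declarative constraints whose violations penalize the decision objective. The sketch below illustrates that idea on a toy sequence-labeling task with brute-force decoding; the feature weights, the single constraint, and the penalty `rho` are all hypothetical illustrations, not the paper's actual formulation or implementation.

```python
# Minimal sketch of CCM-style inference: maximize the linear model score
# minus a penalty for each violated declarative constraint.
# All features, labels, and constraints here are toy assumptions.
from itertools import product

LABELS = ["O", "B", "I"]

def score(x, y, weights):
    # Linear model score: sum of (token, label) emission weights plus
    # (previous label, label) transition weights (hypothetical features).
    s = 0.0
    for i, (tok, lab) in enumerate(zip(x, y)):
        s += weights.get((tok, lab), 0.0)
        if i > 0:
            s += weights.get((y[i - 1], lab), 0.0)
    return s

def violations(y):
    # Declarative constraint (toy example): an "I" label may not follow
    # "O" -- an inside tag must continue a segment begun by "B" or "I".
    return sum(1 for a, b in zip(y, y[1:]) if a == "O" and b == "I")

def ccm_decode(x, weights, rho=10.0):
    # CCM objective: argmax_y  w . phi(x, y)  -  rho * d(y, constraints).
    # Brute-force search over all label sequences (fine for toy inputs).
    best, best_score = None, float("-inf")
    for y in product(LABELS, repeat=len(x)):
        s = score(x, y, weights) - rho * violations(y)
        if s > best_score:
            best, best_score = list(y), s
    return best
```

With weights that locally favor an illegal "O" → "I" transition, the constraint penalty redirects the decoder to a consistent sequence such as "B" followed by "I", which is the modularity the abstract highlights: the linear model is trained as usual, and constraints shape only the output space at decision time.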

Details

Language :
English
ISSN :
0885-6125
Volume :
88
Issue :
3
Database :
Complementary Index
Journal :
Machine Learning
Publication Type :
Academic Journal
Accession number :
77439762
Full Text :
https://doi.org/10.1007/s10994-012-5296-5