Concise Integer Linear Programming Formulations for Dependency Parsing
From HLT@INESC-ID
Revision as of 14:11, 7 January 2010
André Martins
André Martins is a PhD student at the Language Technologies Institute within the School of Computer Science at Carnegie Mellon.
Addresses: http://www.cs.cmu.edu/~afm/ · afm+@cs.cmu.edu
Date
- 15:00, Friday, January 8th, 2010
- Room 336
Speaker
- André Martins, Carnegie Mellon University, USA
Abstract
We formulate the problem of non-projective dependency parsing as a polynomial-sized integer linear program. Our formulation is able to handle non-local output features in an efficient manner; not only is it compatible with prior knowledge encoded as hard constraints, it can also learn soft constraints from data. In particular, our model is able to learn correlations among neighboring arcs (siblings and grandparents), word valency, and tendencies toward nearly-projective parses.
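As a minimal illustration of the objective such a formulation optimizes, the sketch below enumerates the feasible set of an arc-factored parser by brute force (hand-set toy scores and exhaustive search standing in for the paper's learned features and ILP solver; all names and numbers here are hypothetical):

```python
from itertools import product

# Toy sentence with n = 3 real tokens; token 0 is the artificial root.
# score[(h, m)] is a hand-set (hypothetical) score for an arc head h -> modifier m.
score = {
    (0, 1): 2.0, (0, 2): 9.0, (0, 3): 1.0,
    (1, 2): 3.0, (1, 3): 1.0,
    (2, 1): 8.0, (2, 3): 7.0,
    (3, 1): 1.0, (3, 2): 2.0,
}
n = 3

def is_tree(heads):
    # A head assignment is a valid dependency tree iff every token
    # reaches the root 0 without revisiting a node (no cycles).
    for m in range(1, n + 1):
        seen, cur = set(), m
        while cur != 0:
            if cur in seen:
                return False
            seen.add(cur)
            cur = heads[cur]
    return True

# Enumerate the feasible set directly (exactly one head per token, acyclicity)
# and maximize the arc-factored score -- the same objective an ILP encodes
# with 0/1 arc-indicator variables z[h, m].
best, best_score = None, float("-inf")
for hs in product(range(n + 1), repeat=n):
    heads = dict(zip(range(1, n + 1), hs))
    if not is_tree(heads):
        continue
    s = sum(score[(heads[m], m)] for m in range(1, n + 1))
    if s > best_score:
        best, best_score = heads, s

print(best, best_score)  # highest-scoring tree (modifier -> head map) and its score
```

Brute force is exponential in sentence length, which is exactly why the talk's polynomial-sized ILP encoding of the same feasible set matters; it also keeps the model open to the non-local features (siblings, grandparents, valency) mentioned above, which a simple arc-factored score cannot express.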
The model parameters are learned in a max-margin framework by employing a linear programming relaxation. We evaluate the performance of our parser on data in several natural languages, achieving improvements over existing state-of-the-art methods.
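As a loose illustration of the structured-learning loop behind such training (a structured-perceptron simplification, not the max-margin/LP-relaxation method of the talk; the toy trees and feature map are hypothetical):

```python
def features(tree, n):
    """Indicator features, one per (head, modifier) arc; `tree` maps modifier -> head."""
    f = {}
    for m in range(1, n + 1):
        arc = (tree[m], m)
        f[arc] = f.get(arc, 0) + 1
    return f

def perceptron_update(w, gold, pred, n, lr=1.0):
    """Move weights toward gold-tree arcs and away from wrongly predicted arcs."""
    for arc, v in features(gold, n).items():
        w[arc] = w.get(arc, 0.0) + lr * v
    for arc, v in features(pred, n).items():
        w[arc] = w.get(arc, 0.0) - lr * v
    return w

w = {}
gold = {1: 2, 2: 0, 3: 2}   # toy gold tree (modifier -> head)
pred = {1: 0, 2: 0, 3: 0}   # toy (wrong) prediction from the current weights
w = perceptron_update(w, gold, pred, 3)
print(w)  # gold-only arcs gain weight, wrongly predicted arcs lose it
```

A max-margin learner refines this idea by updating against the highest-scoring tree under a loss-augmented objective, with the LP relaxation making that inner optimization tractable.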