Paper: Learning Semantic Correspondences with Less Supervision

ACL ID P09-1011
Title Learning Semantic Correspondences with Less Supervision
Venue Annual Meeting of the Association for Computational Linguistics
Session Main Conference
Year 2009
Authors

A central problem in grounded language acquisition is learning the correspondences between a rich world state and a stream of text that references that world state. To deal with the high degree of ambiguity present in this setting, we present a generative model that simultaneously segments the text into utterances and maps each utterance to a meaning representation grounded in the world state. We show that our model generalizes across three domains of increasing difficulty—Robocup sportscasting, weather forecasts (a new domain), and NFL recaps.
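To make the alignment problem concrete, the sketch below pairs an utterance with the record in a toy world state it most plausibly describes. This is not the paper's generative model, only a word-overlap heuristic for illustration; the record fields and utterances are invented examples.

```python
# Toy illustration of grounded alignment: given a world state (a list of
# records) and an utterance, pick the record whose content words overlap
# most with the utterance. NOT the paper's model -- a heuristic sketch;
# all record names and values below are hypothetical.

def align(utterance, records):
    """Return the record whose field values share the most words with the utterance."""
    words = set(utterance.lower().split())

    def overlap(rec):
        rec_words = set(" ".join(str(v) for v in rec.values()).lower().split())
        return len(words & rec_words)

    return max(records, key=overlap)

# Hypothetical weather-domain world state with two records.
world = [
    {"type": "temperature", "min": "20", "max": "30"},
    {"type": "windspeed", "min": "10", "max": "20"},
]

best = align("a high of 30 degrees", world)
print(best["type"])  # -> temperature
```

The paper's actual model resolves this ambiguity jointly with segmentation and generatively, rather than by surface word overlap.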