Paper: Shift-Reduce CCG Parsing with a Dependency Model

ACL ID P14-1021
Title Shift-Reduce CCG Parsing with a Dependency Model
Venue Annual Meeting of the Association for Computational Linguistics
Session Main Conference
Year 2014
Authors

This paper presents the first dependency model for a shift-reduce CCG parser. Modelling dependencies is desirable for a number of reasons, including handling the "spurious" ambiguity of CCG; fitting well with the theory of CCG; and optimizing for structures which are evaluated at test time. We develop a novel training technique using a dependency oracle, in which all derivations are hidden. A challenge arises from the fact that the oracle needs to keep track of exponentially many gold-standard derivations, which is solved by integrating a packed parse forest with the beam-search decoder. Standard CCGBank tests show the model achieves up to 1.05 labeled F-score improvements over three existing, competitive CCG parsing models.
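To make the shift-reduce framing concrete, the following is a minimal sketch of the generic shift-reduce loop that such a parser is built on, with actions chosen by an oracle function. All names here (`combine`, `can_reduce`, `shift_reduce_parse`) are hypothetical illustrations, not the paper's implementation; a real CCG parser would apply combinatory rules in `combine`, score actions with a trained model, and keep a beam of candidate stacks rather than a single greedy one.

```python
# Illustrative sketch of a generic shift-reduce loop (not the paper's code).
# A CCG parser would replace `combine` with combinatory rule application
# and `oracle` with a learned action scorer over a beam of states.

def can_reduce(stack):
    # A reduce needs two subtrees on the stack to combine.
    return len(stack) >= 2

def combine(left, right):
    # Placeholder for applying a CCG combinatory rule to two subtrees.
    return (left, right)

def shift_reduce_parse(words, oracle):
    """Parse `words` using actions from `oracle(stack, queue)`,
    which returns 'SHIFT' or 'REDUCE'."""
    stack, queue = [], list(words)
    while queue or len(stack) > 1:
        action = oracle(stack, queue)
        if action == 'SHIFT' and queue:
            stack.append(queue.pop(0))
        elif action == 'REDUCE' and can_reduce(stack):
            right = stack.pop()
            left = stack.pop()
            stack.append(combine(left, right))
        else:
            raise ValueError(f"invalid action {action!r}")
    return stack[0]

# A trivial oracle: shift everything, then reduce left-to-right.
greedy = lambda stack, queue: 'SHIFT' if queue else 'REDUCE'
tree = shift_reduce_parse(['I', 'saw', 'her'], greedy)
print(tree)  # -> ('I', ('saw', 'her'))
```

The dependency oracle described in the abstract generalizes this idea: instead of dictating a single gold action sequence, it accepts any action consistent with the gold-standard dependencies, which requires tracking exponentially many valid derivations via a packed forest.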