Paper: Effective Self-Training For Parsing

ACL ID N06-1020
Title Effective Self-Training For Parsing
Venue Human Language Technologies
Session Main Conference
Year 2006
Authors David McClosky, Eugene Charniak, Mark Johnson

We present a simple, but surprisingly effective, method of self-training a two-phase parser-reranker system using readily available unlabeled data. We show that this type of bootstrapping is possible for parsing when the bootstrapped parses are processed by a discriminative reranker. Our improved model achieves an f-score of 92.1%, an absolute 1.1% improvement (12% error reduction) over the previous best result for Wall Street Journal parsing. Finally, we provide some analysis to better understand the phenomenon.
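
The self-training loop summarized in the abstract can be sketched as follows: parse unlabeled sentences with the first-stage parser, let the discriminative reranker select the best parse, and add the selected parses back into the first-stage parser's training data. The sketch below is a minimal, hypothetical illustration in Python; the Parser and Reranker interfaces, their method names, and the retraining step are assumptions for exposition, not the authors' actual implementation.

```python
from typing import List, Tuple


class Parser:
    """Stand-in first-stage generative parser (hypothetical interface)."""

    def __init__(self, treebank: List[Tuple[str, str]]):
        self.treebank = list(treebank)  # (sentence, tree) training pairs

    def nbest_parse(self, sentence: str, n: int = 50) -> List[str]:
        # Placeholder: a real parser would return its n-best trees here.
        return [f"(TOP {sentence})"][:n]


class Reranker:
    """Stand-in discriminative reranker (hypothetical interface)."""

    def rerank(self, candidates: List[str]) -> str:
        # Placeholder: a real reranker would rescore candidates with features.
        return candidates[0]


def self_train(labeled: List[Tuple[str, str]],
               unlabeled: List[str],
               rounds: int = 1,
               n_best: int = 50) -> Tuple[Parser, Reranker]:
    """Parse unlabeled sentences, pick the reranker's best parse for each,
    and retrain the first-stage parser on the augmented treebank."""
    parser, reranker = Parser(labeled), Reranker()
    for _ in range(rounds):
        new_pairs = []
        for sentence in unlabeled:
            candidates = parser.nbest_parse(sentence, n=n_best)
            best = reranker.rerank(candidates)
            new_pairs.append((sentence, best))
        # Retrain only the first-stage parser on labeled + self-labeled data.
        parser = Parser(parser.treebank + new_pairs)
    return parser, reranker


if __name__ == "__main__":
    labeled = [("the cat sat", "(S (NP the cat) (VP sat))")]
    unlabeled = ["a dog barked", "the parser improved"]
    parser, _ = self_train(labeled, unlabeled)
    print(len(parser.treebank))  # original pairs plus self-labeled ones
```

In this sketch only the first-stage parser is retrained on the self-labeled data while the reranker stays fixed, which mirrors the paper's setup of bootstrapping the parser and using the reranker to filter its output.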