Paper: Readability Annotation: Replacing the Expert by the Crowd

ACL ID W11-1415
Title Readability Annotation: Replacing the Expert by the Crowd
Venue Innovative Use of NLP for Building Educational Applications
Year 2011
This paper investigates two strategies for collecting readability assessments: an Expert Readers application intended to collect fine-grained readability assessments from language experts, and a Sort by Readability application designed to be intuitive and open to everyone with internet access. We show that the data sets resulting from both annotation strategies are very similar. We conclude that crowdsourcing is a viable alternative to the opinions of language experts for readability prediction.