Paper: Automated Essay Scoring by Maximizing Human-Machine Agreement

ACL ID D13-1180
Title Automated Essay Scoring by Maximizing Human-Machine Agreement
Venue Conference on Empirical Methods in Natural Language Processing
Session Main Conference
Year 2013
Authors

Previous approaches for automated essay scoring (AES) learn a rating model by minimizing either the classification, regression, or pairwise classification loss, depending on the learning algorithm used. In this paper, we argue that the current AES systems can be further improved by taking into account the agreement between human and machine raters. To this end, we propose a rank-based approach that utilizes listwise learning-to-rank algorithms for learning a rating model, where the agreement between the human and machine raters is directly incorporated into the loss function. Various linguistic and statistical features are utilized to facilitate the learning algorithms. Experiments on the publicly available English essay dataset, Automated Student Assessment Prize (ASAP), show t...
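
The abstract does not spell out which human-machine agreement measure enters the loss function; on the ASAP dataset the standard agreement statistic is quadratic weighted kappa, so the sketch below (an illustrative implementation assuming that metric, not code from the paper) shows how such an agreement score can be computed from two raters' integer scores.

```python
import numpy as np

def quadratic_weighted_kappa(human, machine, min_score, max_score):
    """Quadratic weighted kappa between two raters' integer scores."""
    human = np.asarray(human, dtype=int)
    machine = np.asarray(machine, dtype=int)
    n_ratings = max_score - min_score + 1

    # Observed agreement matrix O: counts of (human, machine) score pairs.
    observed = np.zeros((n_ratings, n_ratings))
    for h, m in zip(human, machine):
        observed[h - min_score, m - min_score] += 1

    # Expected matrix E: outer product of the marginal score histograms,
    # normalised to the same total count as O.
    hist_human = observed.sum(axis=1)
    hist_machine = observed.sum(axis=0)
    expected = np.outer(hist_human, hist_machine) / observed.sum()

    # Quadratic disagreement weights: larger penalty for larger score gaps.
    idx = np.arange(n_ratings)
    weights = (idx[:, None] - idx[None, :]) ** 2 / (n_ratings - 1) ** 2

    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

# Perfect agreement gives kappa = 1.0; chance-level agreement gives ~0.
print(quadratic_weighted_kappa([1, 2, 3, 4], [1, 2, 3, 4], 1, 4))
```

Because the statistic is 1.0 for perfect agreement and near 0 for chance-level agreement, making it (or a surrogate of it) part of the training objective directly pushes the machine rater's rankings toward those of the human rater, which is the intuition behind the listwise approach described above.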