Paper: Modeling Prompt Adherence in Student Essays

ACL ID P14-1144
Title Modeling Prompt Adherence in Student Essays
Venue Annual Meeting of the Association of Computational Linguistics
Session Main Conference
Year 2014

Recently, researchers have begun exploring methods of scoring student essays with respect to particular dimensions of quality such as coherence, technical errors, and prompt adherence. The work on modeling prompt adherence, however, has been focused mainly on whether individual sentences adhere to the prompt. We present a new annotated corpus of essay-level prompt adherence scores and propose a feature-rich approach to scoring essays along the prompt adherence dimension. Our approach significantly outperforms a knowledge-lean baseline prompt adherence scoring system, yielding improvements of up to 16.6%.