ACL Anthology Network
ACL ID | W08-1209 |
---|---|
Title | An Agreement Measure for Determining Inter-Annotator Reliability of Human Judgements on Affective Text |
Venue | Coling 2008: Proceedings of the workshop on Human Judgements in Computational Linguistics |
Year | 2008 |
Authors | Plaban Kumar Bhowmick, Anupam Basu, Pabitra Mitra |
An affective text may be judged to belong to multiple affect categories, as it may evoke different affects with varying degrees of intensity. For affect classification, the text corpus must first be annotated with affect categories, a task typically performed by several human judges. This paper presents a new agreement measure, inspired by the Kappa coefficient, for computing inter-annotator reliability when annotators are free to assign a text to more than one category. The extended reliability coefficient is applied to measure the quality of an affective text corpus, and an analysis of the factors that influence corpus quality is provided.
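The paper's extended coefficient is not reproduced in this abstract, but the following minimal Python sketch illustrates the ingredients it builds on: standard Cohen's kappa (observed agreement corrected for chance) for single-label annotation, and a hypothetical set-overlap score for the multi-label setting the paper addresses. The function names and the Jaccard-based multi-label scoring are illustrative assumptions, not the paper's actual formulation.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators assigning one label per item."""
    n = len(labels_a)
    # Observed agreement: fraction of items with identical labels.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: inner product of the annotators' marginal label distributions.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

def multilabel_observed_agreement(sets_a, sets_b):
    """Observed agreement when each annotator assigns a *set* of affect
    categories per item, scored as the Jaccard overlap of the two sets.
    This is one common choice, shown for illustration only; the paper's
    measure chance-corrects multi-category judgements differently."""
    return sum(len(a & b) / len(a | b) for a, b in zip(sets_a, sets_b)) / len(sets_a)

if __name__ == "__main__":
    # Single-label case: classic kappa.
    a = ["anger", "joy", "fear", "joy", "anger"]
    b = ["anger", "joy", "joy", "joy", "anger"]
    print(f"Cohen's kappa: {cohens_kappa(a, b):.3f}")

    # Multi-label case: each text may evoke several affects at once.
    sa = [{"anger"}, {"joy", "fear"}, {"fear"}]
    sb = [{"anger", "disgust"}, {"joy"}, {"fear"}]
    print(f"Multi-label observed agreement: {multilabel_observed_agreement(sa, sb):.3f}")
```

The key point the sketch makes concrete is why a plain kappa does not transfer to this setting: when each judge may mark several categories per text, agreement on an item is no longer a yes/no match but a degree of overlap, and both the observed and the chance-expected terms must be redefined accordingly.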