Paper: Entropy Rate Constancy In Text

ACL ID P02-1026
Title Entropy Rate Constancy In Text
Venue Annual Meeting of the Association for Computational Linguistics
Session Main Conference
Year 2002

We present a constancy rate principle governing language generation. We show that this principle implies that local measures of entropy (ignoring context) should increase with the sentence number. We demonstrate that this is indeed the case by measuring entropy in three different ways. We also show that this effect has both lexical (which words are used) and non-lexical (how the words are used) causes.
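One such local, context-ignoring measure can be sketched as a unigram cross-entropy averaged over sentences grouped by their position in a document. The sketch below is illustrative only, not the paper's actual method; the function name, the add-one smoothing, and the toy data layout are assumptions.

```python
import math
from collections import Counter

def unigram_entropy_by_position(documents):
    """Estimate per-word unigram cross-entropy (bits) for each
    sentence position across a corpus.

    documents: list of documents; each document is a list of
    sentences; each sentence is a list of word tokens.
    """
    # Fit a single unigram model on the whole corpus,
    # with add-one smoothing (an assumption of this sketch).
    counts = Counter(w for doc in documents for sent in doc for w in sent)
    total = sum(counts.values())
    vocab = len(counts)

    def logprob(w):
        return math.log2((counts[w] + 1) / (total + vocab))

    # Sum negative log-probabilities and word counts, keyed by the
    # sentence's position (number) within its document.
    neg_logp, n_words = Counter(), Counter()
    for doc in documents:
        for i, sent in enumerate(doc):
            neg_logp[i] += -sum(logprob(w) for w in sent)
            n_words[i] += len(sent)

    # Average bits per word at each sentence position.
    return {i: neg_logp[i] / n_words[i] for i in neg_logp if n_words[i]}
```

Under the entropy rate constancy principle, this per-position estimate would be expected to rise with the sentence number, since a context-free model cannot exploit the growing discourse context.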