Paper: Representational Bias In Unsupervised Learning Of Syllable Structure

ACL ID W05-0615
Title Representational Bias In Unsupervised Learning Of Syllable Structure
Venue International Conference on Computational Natural Language Learning
Session Main Conference
Year 2005
Authors

Unsupervised learning algorithms based on Expectation Maximization (EM) are often straightforward to implement and provably converge on a local likelihood maximum. However, these algorithms often do not perform well in practice. Common wisdom holds that they yield poor results because they are overly sensitive to initial parameter values and easily get stuck in local (but not global) maxima. We present a series of experiments indicating that for the task of learning syllable structure, the initial parameter weights are not crucial. Rather, it is the choice of model class itself that makes the difference between successful and unsuccessful learning. We use a language-universal rule-based algorithm to find a good set of parameters, and then train the parameter weights using EM. W...
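As a rough illustration of the training recipe the abstract describes (hand-pick the model class, then let EM fit the weights), here is a minimal Python sketch. It is not the authors' actual model: the toy task, the data, and all names (em, clusters, coda, onset) are assumptions introduced only to show a working E-step/M-step loop over a latent split variable.

```python
# Minimal EM sketch (illustrative only, not the paper's model).
# Latent variable: where an intervocalic consonant cluster splits into
# coda of syllable 1 + onset of syllable 2.  Parameters: categorical
# distributions over consonants in coda vs. onset position.
from collections import defaultdict

# Toy data: consonant clusters observed between two vowels,
# e.g. the "str" in "astra".  Purely hypothetical examples.
clusters = ["str", "mp", "nt", "st", "tr", "mpr"]

def em(clusters, iters=20, smooth=1e-3):
    # Initialise coda/onset distributions uniformly over observed consonants.
    symbols = sorted({c for cl in clusters for c in cl})
    coda = {c: 1.0 / len(symbols) for c in symbols}
    onset = {c: 1.0 / len(symbols) for c in symbols}

    for _ in range(iters):
        exp_coda = defaultdict(float)
        exp_onset = defaultdict(float)
        # E-step: posterior over split points k for each cluster;
        # cluster[:k] is assigned to the coda, cluster[k:] to the onset.
        for cl in clusters:
            scores = []
            for k in range(len(cl) + 1):
                p = 1.0
                for c in cl[:k]:
                    p *= coda[c]
                for c in cl[k:]:
                    p *= onset[c]
                scores.append(p)
            z = sum(scores) or 1.0
            for k, s in enumerate(scores):
                w = s / z
                for c in cl[:k]:
                    exp_coda[c] += w
                for c in cl[k:]:
                    exp_onset[c] += w
        # M-step: re-normalise expected counts into new distributions.
        for dist, counts in ((coda, exp_coda), (onset, exp_onset)):
            total = sum(counts[c] + smooth for c in symbols)
            for c in symbols:
                dist[c] = (counts[c] + smooth) / total
    return coda, onset

coda, onset = em(clusters)
print("P(coda):", {c: round(p, 3) for c, p in coda.items()})
print("P(onset):", {c: round(p, 3) for c, p in onset.items()})
```

The sketch fixes the model class by hand (position-specific categorical distributions) and only lets EM adjust the weights, mirroring the abstract's point that the choice of model class, rather than the initial weights, is what drives success.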