12. Learning Serial Constraint-based Grammars
Robert Staubs
This paper proposes a method for learning grammars in the general framework of Harmonic Serialism, a variant of Optimality Theory with gradual derivations. The method addresses the structural ambiguity introduced by derivations and handles variable as well as categorical data. Maximum Entropy serial grammars generate an unbounded number of possible derivations. By formalizing the derivational space in terms of Markov chains, probabilities over those derivations can be calculated, allowing standard optimization methods to fit the grammar model to the learning data. Learning results are provided for simplified versions of an opaque stress-epenthesis interaction and of a variable vowel deletion process modeled after French schwa.
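The Markov-chain idea in the abstract can be sketched concretely: treat intermediate forms as transient states and converged outputs as absorbing states, assign each derivational step a Maximum Entropy probability (proportional to the exponential of its weighted-constraint harmony), and read off output probabilities from the chain's absorption probabilities. The states, candidate steps, constraint violations, and weights below are invented toy assumptions for illustration, not the paper's actual analysis.

```python
import numpy as np

# Toy Harmonic Serialism derivation space as an absorbing Markov chain.
# States 0-2 are intermediate forms (transient); states 3-4 are
# converged outputs (absorbing).  All forms and numbers are hypothetical.

weights = np.array([2.0, 1.0])          # hypothetical constraint weights

# candidates[state] -> list of (violation vector, next state) pairs
candidates = {
    0: [(np.array([1, 0]), 1),          # e.g. epenthesis  -> state 1
        (np.array([0, 1]), 2)],         # e.g. stress shift -> state 2
    1: [(np.array([0, 0]), 3),          # converge to output A
        (np.array([0, 2]), 2)],
    2: [(np.array([1, 1]), 3),          # converge to output A
        (np.array([0, 0]), 4)],         # converge to output B
}

n_trans, n_abs = 3, 2
Q = np.zeros((n_trans, n_trans))        # transient -> transient
R = np.zeros((n_trans, n_abs))          # transient -> absorbing

for s, cands in candidates.items():
    # MaxEnt step probabilities: softmax over harmonies -(w . violations)
    harmonies = np.array([-weights @ v for v, _ in cands])
    probs = np.exp(harmonies) / np.exp(harmonies).sum()
    for p, (_, t) in zip(probs, cands):
        if t < n_trans:
            Q[s, t] += p
        else:
            R[s, t - n_trans] += p

# Fundamental matrix N = (I - Q)^{-1}; B[s, a] is the probability of
# eventually converging on output a when the derivation starts at s.
N = np.linalg.inv(np.eye(n_trans) - Q)
B = N @ R

print(B[0])                             # output distribution from the start form
```

Because every derivation eventually converges, each row of `B` sums to 1; these output probabilities are differentiable in the constraint weights, which is what makes fitting by standard optimization possible.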