Bayesian models are usually learned using batch algorithms that must iterate multiple times over the full dataset. This is both computationally expensive and, from a cognitive point of view, highly implausible. We present a novel online algorithm for the word segmentation models of Goldwater et al. (2009) which is, to our knowledge, the first published particle filter for this kind of model. In contrast to other proposed algorithms, it also comes with a theoretical guarantee of optimality as the number of particles goes to infinity. While this is, of course, a theoretical point, a first experimental evaluation of our algorithm shows that, as predicted, its performance improves with the use of more particles, and that it performs competitively with other online learners proposed in Pearl et al. (2011).
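The segmentation model itself is not reproduced here, but the core particle-filter idea the abstract relies on — maintaining a weighted population of hypotheses that is updated online, one observation at a time, with estimates that improve as the particle count grows — can be sketched generically. The toy 1-D Gaussian state-space model below is an illustrative assumption, not the Goldwater et al. (2009) model:

```python
import math
import random

def bootstrap_particle_filter(observations, n_particles, seed=0):
    # Toy model (an assumption for illustration):
    #   state:       x_t = x_{t-1} + N(0, 1)
    #   observation: y_t = x_t + N(0, 1)
    rng = random.Random(seed)
    particles = [0.0] * n_particles
    estimates = []
    for y in observations:
        # Propagate each particle through the transition model.
        particles = [x + rng.gauss(0, 1) for x in particles]
        # Weight each particle by the observation likelihood N(y; x, 1).
        weights = [math.exp(-0.5 * (y - x) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Posterior-mean estimate from the weighted particle cloud.
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # Multinomial resampling: redraw particles proportional to weight.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates
```

Each observation is processed exactly once, which is what makes the learner online; the standard consistency result for such filters (exact posterior recovered as `n_particles` goes to infinity) is the guarantee the abstract appeals to.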
Number of pages: 9
Journal: Proceedings of the Australasian Language Technology Association Workshop 2011
Publication status: Published - 2011
Event: Australasian Language Technology Workshop (9th: 2011), Canberra
Duration: 1 Dec 2011 → 2 Dec 2011