Sequential latent Dirichlet allocation

Lan Du*, Wray Buntine, Huidong Jin, Changyou Chen

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

28 Citations (Scopus)


Understanding how topics within a document evolve over the structure of the document is an interesting and potentially important problem in exploratory and predictive text analytics. In this article, we address this problem by presenting a novel variant of latent Dirichlet allocation (LDA): Sequential LDA (SeqLDA). This variant directly considers the underlying sequential structure, i.e., a document consists of multiple segments (e.g., chapters, paragraphs), each of which is correlated to its antecedent and subsequent segments. Such progressive sequential dependency is captured by using the hierarchical two-parameter Poisson-Dirichlet process (HPDP). We develop an efficient collapsed Gibbs sampling algorithm to sample from the posterior of the SeqLDA based on the HPDP. Our experimental results on patent documents show that by considering the sequential structure within a document, our SeqLDA model achieves higher fidelity than LDA in terms of perplexity (a standard measure of dictionary-based compressibility). The SeqLDA model also yields a more coherent sequential topic structure than LDA, as we show in experiments on several books such as Melville's 'Moby Dick'.
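The full SeqLDA sampler over the HPDP is beyond the scope of this page, but the collapsed Gibbs sampling mentioned in the abstract builds on the standard LDA case. The following is a minimal, illustrative toy sampler for plain LDA (not the paper's SeqLDA algorithm; all names and hyperparameter values here are assumptions for illustration): topic assignments are resampled one token at a time from the conditional proportional to (n_dk + alpha) * (n_kw + beta) / (n_k + V*beta), after decrementing that token's counts.

```python
import random

def lda_gibbs(docs, K, V, alpha=0.1, beta=0.01, iters=50, seed=0):
    """Toy collapsed Gibbs sampler for standard LDA.

    docs: list of documents, each a list of word ids in [0, V).
    Returns per-document topic counts and per-topic word counts.
    """
    rng = random.Random(seed)
    ndk = [[0] * K for _ in docs]       # topic counts per document
    nkw = [[0] * V for _ in range(K)]   # word counts per topic
    nk = [0] * K                        # total tokens per topic
    z = []                              # current topic of each token

    # Random initialization of topic assignments.
    for d, doc in enumerate(docs):
        zd = []
        for w in doc:
            k = rng.randrange(K)
            zd.append(k)
            ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
        z.append(zd)

    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                # Remove the token's current assignment from the counts.
                k = z[d][i]
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                # Collapsed conditional: p(z=j | rest)
                #   ∝ (n_dj + alpha) * (n_jw + beta) / (n_j + V*beta)
                weights = [(ndk[d][j] + alpha) * (nkw[j][w] + beta)
                           / (nk[j] + V * beta) for j in range(K)]
                r = rng.random() * sum(weights)
                for j, wt in enumerate(weights):
                    r -= wt
                    if r <= 0:
                        k = j
                        break
                # Record the new assignment and restore the counts.
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return ndk, nkw

docs = [[0, 0, 1, 1], [2, 2, 3, 3], [0, 1, 2, 3]]
ndk, nkw = lda_gibbs(docs, K=2, V=4)
```

SeqLDA replaces the independent per-document topic priors used above with a chain of Pitman-Yor (two-parameter Poisson-Dirichlet) processes, so each segment's topic distribution is drawn with its predecessor's as base measure.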

Original language: English
Pages (from-to): 475-503
Number of pages: 29
Journal: Knowledge and Information Systems
Issue number: 3
Publication status: Published - Jun 2012


Keywords

  • Collapsed Gibbs sampler
  • Document structure
  • Latent Dirichlet allocation
  • Poisson-Dirichlet process
  • Topic model


