## Abstract

Parameters of a finite mixture model are often estimated by the expectation–maximization (EM) algorithm, which maximizes the observed-data log-likelihood function. This paper proposes an alternative approach for fitting finite mixture models. Our method, called iterative Monte Carlo classification (IMCC), is also an iterative fitting procedure. Within each iteration, it first estimates the membership probabilities for each data point, namely the conditional probability that the data point belongs to a particular mixing component given its observed value. It then classifies each data point into a component distribution using the estimated membership probabilities and the Monte Carlo method, and finally updates the parameters of each component distribution based on the classified data. Simulation studies were conducted to compare IMCC with other algorithms for fitting mixtures of normal and mixtures of *t* densities.
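The three steps described in the abstract can be illustrated with a minimal sketch for a two-component univariate normal mixture. This is not the authors' implementation; function and variable names are illustrative, and the initialization and stopping rule are simple assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated at each point of x."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def imcc_fit(x, n_iter=200):
    """One possible IMCC-style loop for a two-component normal mixture."""
    # Crude initialization (an assumption for this sketch).
    pi = np.array([0.5, 0.5])
    mu = np.array([x.min(), x.max()], dtype=float)
    sigma = np.array([x.std(), x.std()])
    for _ in range(n_iter):
        # Step 1: membership probabilities, i.e. the conditional
        # probability of each component given the observed value.
        dens = np.stack(
            [pi[k] * normal_pdf(x, mu[k], sigma[k]) for k in range(2)], axis=1
        )
        tau = dens / dens.sum(axis=1, keepdims=True)
        # Step 2: Monte Carlo classification - draw a component
        # label for each point from its membership probabilities.
        z = (rng.random(len(x)) < tau[:, 1]).astype(int)
        # Step 3: update each component's parameters from the
        # points classified into it.
        for k in range(2):
            xk = x[z == k]
            if len(xk) < 2:
                continue  # keep old parameters if a component is nearly empty
            pi[k] = len(xk) / len(x)
            mu[k] = xk.mean()
            sigma[k] = xk.std()
        pi /= pi.sum()
    return pi, mu, sigma

# Usage: fit a mixture of N(-2, 1) and N(3, 1) components.
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 700)])
pi, mu, sigma = imcc_fit(x)
```

Because step 2 draws labels at random rather than averaging over them, the fitted parameters fluctuate from iteration to iteration; in practice one might average the last iterations or monitor the complete-data log-likelihood mentioned in the keywords.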

| Original language | English |
| --- | --- |
| Pages (from-to) | 6684-6693 |
| Number of pages | 10 |
| Journal | Communications in Statistics - Theory and Methods |
| Volume | 46 |
| Issue number | 13 |
| DOIs | |
| Publication status | Published - 3 Jul 2017 |

## Keywords

- complete data log-likelihood function
- EM algorithm
- finite mixture models
- IMCC algorithm
- membership probability