
A couple of questions about GMMs

Veronica_echo
Joined: 02/23/2018 - 01:57
A couple of questions about GMMs

Hi all,

I am conducting a literature review and would like to fit growth mixture models (GMMs) using OpenMx. I have a few questions. First, after looking through R scripts from previous GMM-related topics, my understanding is that the first step of fitting a mixture model is to estimate the mixing proportions and the parameters of each component, and the second step is to compute posterior class probabilities for each observation. Is that right? If so, which algorithm does OpenMx use to obtain the FIML estimates? Nearly all the publications I have read use the EM algorithm to obtain the parameters and posterior probabilities simultaneously; if OpenMx estimates them separately, does it also use EM, or a different algorithm?
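In code, the "second step" I have in mind (computing posterior class probabilities from already-estimated parameters) is just Bayes' rule. A minimal sketch with made-up parameter values for two univariate normal classes; nothing here is OpenMx-specific:

```python
import numpy as np

# Hypothetical estimates from step 1: mixing proportions, means, SDs.
weights = np.array([0.4, 0.6])
means = np.array([0.0, 8.0])
sds = np.array([1.0, 1.0])

def posterior_probs(y):
    """Posterior probability that observation y belongs to each class.

    P(class k | y) = weight_k * density_k(y) / sum_j weight_j * density_j(y)
    """
    dens = np.exp(-0.5 * ((y - means) / sds) ** 2) / (sds * np.sqrt(2 * np.pi))
    joint = weights * dens
    return joint / joint.sum()

print(posterior_probs(1.0))   # almost entirely class 1 (mean 0)
print(posterior_probs(7.5))   # almost entirely class 2 (mean 8)
```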

Second, some articles mention the "label switching" issue in latent class analysis. My understanding is that OpenMx avoids this issue by letting us construct a labeled submodel for each latent class, but I am still wondering whether there is any way to verify that. Any advice would be appreciated.

Thanks in advance!

jpritikin
Joined: 05/24/2012 - 00:35
label switching

I'll take your second question.

All mixture models suffer from the label switching problem; it doesn't matter which software you use to estimate the model. It helps to define what label switching is. Suppose you have two mixture components A and B, both univariate normals with SD=1, and the data are generated by a mixture with means 0 and 8. When estimating the model, A might be assigned to represent mean 8 and B mean 0, or A might be assigned 0 and B mean 8. The model is equivalent either way. That's the label switching problem. One way to identify a unique solution is to add a constraint like A.mean < B.mean. That way A always gets assigned to 0 and B to 8.
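The example above can be reproduced numerically. This is not OpenMx and not its actual estimator; it is a hand-rolled EM sketch for a two-component univariate normal mixture with SDs fixed at 1, using the same generating means (0 and 8). Two runs with swapped starting values converge to the same solution up to a permutation of labels, and an ordering constraint (analogous to A.mean < B.mean, imposed here by sorting) picks a unique labeling:

```python
import numpy as np

rng = np.random.default_rng(0)
# Data from an equal mixture of N(0, 1) and N(8, 1).
data = np.concatenate([rng.normal(0, 1, 200), rng.normal(8, 1, 200)])

def em_two_normals(x, mu_init, n_iter=100):
    """EM for a 2-component normal mixture, SDs fixed at 1."""
    mu = np.array(mu_init, dtype=float)
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior probability of each component for each point.
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2) / np.sqrt(2 * np.pi)
        resp = pi * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update means and mixing proportions from the posteriors.
        mu = (resp * x[:, None]).sum(axis=0) / resp.sum(axis=0)
        pi = resp.mean(axis=0)
    return mu, pi

# Two starting points whose labels are swapped.
mu_run1, _ = em_two_normals(data, [-1.0, 9.0])  # A starts low, B starts high
mu_run2, _ = em_two_normals(data, [9.0, -1.0])  # A starts high, B starts low

print(mu_run1)           # roughly [0, 8]
print(mu_run2)           # roughly [8, 0] -- same fit, labels swapped
print(np.sort(mu_run2))  # ordering constraint restores a unique labeling
```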

OpenMx users are usually only interested in the best solution. Therefore, label switching is typically not much of a problem, because we don't hesitate to select a single best solution. In MCMC approaches, however, label switching is catastrophic and makes it very difficult to assess convergence to the posterior. Here's some discussion: http://staffblogs.le.ac.uk/bayeswithstata/2014/05/18/label-switching/

Veronica_echo
Joined: 02/23/2018 - 01:57
Thanks

Thanks for your kind reply. I think the advice of adding constraints to the model is really useful!