
3-class Mixture Model: Help to find Starting Values for each class!!

Aivil
3-class Mixture Model: Help to find Starting Values for each class!!

Hello,

I am currently trying to fit a 3-class mixture model to longitudinal data. I'm having a hard time understanding how I should set the starting values for each class.

As far as I understand from Chapter 7 of "Growth Modeling" (Grimm, Ram & Estabrook, 2017, p. 150), the starting values should be different for each class.

I have tried using the mxTryHard() function to find better-fitting starting values, but there seems to be no difference in the output with or without it.

I attach the code, in which you can see that I have erroneously set the starting values to 0 for each class.

Any help is deeply appreciated, I am definitely stuck!!
Thanks!!!!!

class1 <- mxModel('Class1', type='RAM',
                  manifestVars=c('health_2','health_3','health_4','health_5','health_6','health_7','health_8','health_9'),
                  latentVars=c('eta_1','eta_2'),
                  # residual variances
                  mxPath(from=c('health_2','health_3','health_4','health_5','health_6','health_7','health_8','health_9'), arrows=2,
                         free=TRUE, values=1, labels='theta'),
                  # latent variances and covariance
                  mxPath(from=c('eta_1','eta_2'), arrows=2, connect='unique.pairs',
                         free=TRUE, values=c(1, 0.5, 1), labels=c('psi_11', 'psi_12', 'psi_22')),
                  # intercept loadings
                  mxPath(from='eta_1', to=c('health_2','health_3','health_4','health_5','health_6','health_7','health_8','health_9'), arrows=1,
                         free=FALSE, values=1),
                  # slope loadings
                  mxPath(from='eta_2', to=c('health_2','health_3','health_4','health_5','health_6','health_7','health_8','health_9'), arrows=1,
                         free=FALSE, values=c(0, 1,2,3,4,5,6,7)),
                  # latent variable means
                  mxPath(from='one', to=c('eta_1', 'eta_2'), arrows=1,
                         free=TRUE, values=c(0,0), labels=c('alpha_1_c1', 'alpha_2_c1')),
                  # ML Fit Function to run for each individual
                  mxFitFunctionML(vector=TRUE)
) # close model for class 1
 
class2 <- mxModel(class1,
                  # latent means
                  mxPath(from='one', to=c('eta_1', 'eta_2'), arrows=1,
                         free=TRUE, values=c(0,0), labels=c('alpha_1_c2', 'alpha_2_c2')),
                  name='Class2') # close model
 
class3 <- mxModel(class1,
                  # latent means
                  mxPath(from='one', to=c('eta_1', 'eta_2'), arrows=1,
                         free=TRUE, values=c(0,0), labels=c('alpha_1_c3', 'alpha_2_c3')),
                  name='Class3') # close model
 
 
# class proportions: estimate p1 and p2 freely, fix p3 as the reference class
classRP <- mxMatrix('Full', 3, 1, free=c(TRUE, TRUE, FALSE),
                    values=1, lbound=0.001, labels=c('p1', 'p2', 'p3'), name='RProps')
# rescale the unnormalized class proportions so they sum to 1
classP <- mxAlgebra(RProps %x% (1 / sum(RProps)), name='Props')
# -2 log-likelihood of the mixture: log of the class-weighted row likelihoods, summed over rows
algObj <- mxAlgebra(-2* sum(log(Props[1,1] %x% Class1.objective + Props[2,1] %x% Class2.objective + Props[3,1] %x% Class3.objective)),
                    name='mixtureObj')
obj <- mxFitFunctionAlgebra('mixtureObj')
gmm.3.means <- mxModel('3Class Means Growth Mixture Model',
                       mxData(observed=dfwide_OpenMx, type='raw'),
                       class1, class2, class3, classRP, classP, algObj, obj)
 
gmm.3.means <- mxModel(gmm.3.means,
                       mxCI(c('psi_11', 'psi_12', 'psi_22', 'alpha_1_c2', 'alpha_2_c2',
                              'theta', 'alpha_1_c1', 'alpha_2_c1', 'alpha_1_c3', 'alpha_2_c3'))) # list the parameters you want CIs for
gmm.3.means <- mxRun(gmm.3.means, intervals=TRUE)
summary(gmm.3.means)
 
 
model <- mxTryHard(gmm.3.means) # Run the model, returning the result into model
summary(model)
AdminRobK
output?
I have tried using the mxTryHard() function to set the best fitting starting values, but it seems there is no difference in the outputs with or without that function.

So, what are the outputs you're getting?

In any event, I don't think mxTryHard() can help much if you're giving the same start values to all 3 groups.
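
For instance, something along these lines would give each component its own starting point (just a sketch; the particular numbers are placeholders you would replace with plausible class-specific values, e.g. taken from a one-class growth model):

# Sketch: set distinct start values for the latent means of each class via
# omxSetParameters(); the numbers below are placeholders, not recommendations.
gmm.3.means <- omxSetParameters(
  gmm.3.means,
  labels=c('alpha_1_c1', 'alpha_2_c1',
           'alpha_1_c2', 'alpha_2_c2',
           'alpha_1_c3', 'alpha_2_c3'),
  values=c(2, -0.5,   # class 1: lower intercept, declining slope
           4,  0,     # class 2: middling intercept, flat slope
           6,  0.5))  # class 3: higher intercept, increasing slope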

Aivil
Output attached

Thanks for your reply.
You can find the output attached.

I also noticed that right after running the model, there's this line:

> summary(model)
Summary of 2Class Means Growth Mixture Model

But I have specified the model as a 3-class one!! How is that possible?

Thanks!!

AdminRobK
What were you expecting

What were you expecting mxTryHard() to do? It makes multiple attempts to run the model you give it, while randomly perturbing the start values between attempts, until it finds an acceptable solution. If I'm reading your output correctly, you're getting the same result with and without mxTryHard(), meaning that it found an acceptable solution starting with your input start values (i.e. with alpha_1_c1 and alpha_2_c1 initialized at zero for all 3 components). Because local minima can be a problem with mixture models, it's probably wise to pass argument exhaustive=TRUE to mxTryHard(). In that case, mxTryHard() will use all of its extra attempts, and return to you the best solution it found.
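
For example (a sketch; the number of extra attempts is arbitrary):

# Use all of mxTryHard()'s attempts and keep the best solution it finds.
model <- mxTryHard(gmm.3.means, extraTries=30, exhaustive=TRUE)
summary(model)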

You may want to consider simulated annealing instead of the default optimizer.
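
Roughly along these lines, assuming your OpenMx version provides mxComputeSimAnnealing() (a sketch of a custom compute plan, not the only way to set it up):

# Sketch: global search with simulated annealing, then polish with gradient
# descent and compute standard errors. Requires an OpenMx build that has
# mxComputeSimAnnealing().
plan <- mxComputeSequence(list(
  mxComputeSimAnnealing(),
  mxComputeGradientDescent(),
  mxComputeNumericDeriv(),
  mxComputeStandardError(),
  mxComputeReportDeriv()
))
gmm.3.sa <- mxRun(mxModel(gmm.3.means, plan))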

The approach you're using is a little old-fashioned. Yes, I know you're following the User Guide, but the User Guide is out-of-date. You might have an easier time with things if you used mxExpectationMixture().
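
In rough outline, it would look something like this (a sketch only, re-using your class models; the 'weights' matrix and its start values are illustrative):

# Sketch of the same 3-class GMM using mxExpectationMixture(). The component
# models keep mxFitFunctionML(vector=TRUE); the container model holds the data,
# a weights matrix, the mixture expectation, and an ordinary ML fit function.
weights <- mxMatrix('Full', nrow=1, ncol=3, free=c(FALSE, TRUE, TRUE),
                    values=1, name='weights')
gmm.3.mix <- mxModel('GMM3mix', class1, class2, class3,
                     mxData(observed=dfwide_OpenMx, type='raw'),
                     weights,
                     mxExpectationMixture(components=c('Class1', 'Class2', 'Class3'),
                                          weights='weights', scale='softmax'),
                     mxFitFunctionML())
gmm.3.mix <- mxTryHard(gmm.3.mix, exhaustive=TRUE)
summary(gmm.3.mix)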

I also noticed that right after running the model, there's this line:

> summary(model)
Summary of 2Class Means Growth Mixture Model

But i have specified the model as a 3-class one!! how is it possible?

That line merely prints the name of the MxModel object. It only reflects the character string in the MxModel's 'name' slot.
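
For example:

# The heading printed by summary() is just the model's 'name' slot:
model$name
# If you want it to read differently, rename the fitted model with mxRename():
model <- mxRename(model, '3Class Means Growth Mixture Model')
summary(model)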