
Training Data

jeff.harring
Training Data

Does Mx allow for training data for mixture models--that is, can one specify that the first 10 cases in a data set belong to class number 1, or that the last 15 cases belong to class number 3?

Ryne

Hi Jeff,

So I'd think of that as a multiple group model with three groups: two groups with known classes and a third group that is a mixture model. I'm going to outline how it should work, with the caveat that I'm writing pseudocode, so there will probably be areas for improvement and optimization.

First, start with an existing mixture model in OpenMx, which is available in the documentation (though there may be other ways to do that, as OpenMx can handle just about any algebra you throw at it). For the sake of argument, I'm going to assume you have three classes. This is set up as a three-group multiple group model, with each of the three groups containing the full sample. These three submodels are then summed proportional to their mixture weights. When you make your mxData objects for this part of the model, only include the people whose class is unknown.

Then, you're going to create a more traditional multiple group model, also available in the documentation. In this model, you'll have up to four submodels (assuming three classes): one each for the people you know are in classes 1-3, and a fourth submodel that is your mixture model from before. Each of the mxData objects for the three known classes should contain only those individuals known to be in that class. Then, constrain parameters to be equal between the known classes and the estimated classes by using identical labels.

Here's a pseudocode example:

library(OpenMx)
 
# data step: 'harring' is assumed to be a data frame with a knownclass
# column flagging each case's class ("guess" for unknown membership)
data1 <- mxData(harring[harring$knownclass == 1, ], "raw")
data2 <- mxData(harring[harring$knownclass == 2, ], "raw")
data3 <- mxData(harring[harring$knownclass == 3, ], "raw")
dataM <- mxData(harring[harring$knownclass == "guess", ], "raw")
 
# make a template model, which will be simple
template <- mxModel("replaceMe",
  type="RAM",
  manifestVars=c("x", "y", "z"),
  mxPath("x", "y", values=86, labels="replaceThisLabel"),
  # raw data also needs variances and means; in a full model these would be
  # labeled (and relabeled per class) just like the regression path
  mxPath(c("x", "y", "z"), arrows=2, values=1),
  mxPath("one", c("x", "y", "z"))
  )
 
# now make the six models we need: three for known classes, three for mixture
known1 <- mxModel(template, data1, name="known1")
known2 <- mxModel(template, data2, name="known2")
known3 <- mxModel(template, data3, name="known3")
# the mixture components return row-wise likelihoods (vector=TRUE) and will
# share the data held by the mixture container below
mix1 <- mxModel(template, mxFitFunctionML(vector=TRUE), name="class1")
mix2 <- mxModel(template, mxFitFunctionML(vector=TRUE), name="class2")
mix3 <- mxModel(template, mxFitFunctionML(vector=TRUE), name="class3")
 
# change parameter labels so each known-class group shares its parameters
# with the matching mixture component
known1 <- omxSetParameters(known1, labels="replaceThisLabel", newlabels="beta1")
mix1 <- omxSetParameters(mix1, labels="replaceThisLabel", newlabels="beta1")
known2 <- omxSetParameters(known2, labels="replaceThisLabel", newlabels="beta2")
mix2 <- omxSetParameters(mix2, labels="replaceThisLabel", newlabels="beta2")
known3 <- omxSetParameters(known3, labels="replaceThisLabel", newlabels="beta3")
mix3 <- omxSetParameters(mix3, labels="replaceThisLabel", newlabels="beta3")
 
# make your mixture model
mixtureForJeff <- mxModel("mixItAllUp",
  dataM, mix1, mix2, mix3,
  # first weight fixed as the reference; softmax scaling maps the weights to
  # class proportions
  mxMatrix(values=1, nrow=1, ncol=3, free=c(FALSE, TRUE, TRUE), name="weights"),
  mxExpectationMixture(paste0("class", 1:3), weights="weights", scale="softmax"),
  mxFitFunctionML()
)
 
# now make your multiple group model
bigModel <- mxModel("Everything",
  known1, known2, known3, mixtureForJeff,
  mxFitFunctionMultigroup(c("known1", "known2", "known3", "mixItAllUp"))
)
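 
Once everything is defined, you'd fit the combined model with mxRun() and read the class proportions off the weights. A rough sketch, assuming the objects above; with scale="softmax", the proportions are just the softmax of the estimated weights:
 
# fit the combined model and inspect the estimates
fitted <- mxRun(bigModel)
summary(fitted)
 
# convert the softmax-scaled weights into class proportions
w <- mxEval(mixItAllUp.weights, fitted)
exp(w) / sum(exp(w))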

If that doesn't work, it should at least get you close.

-ryne

jeff.harring
Follow-up on training data

Fantastic, Ryne. This helps a lot. Best,

Jeff