Hi there,

I want to test for equal means and variances across twins and zygosities in a trivariate model, separately for the measures.

In the trivariate script I am using, http://ibg.colorado.edu/cdrom2014/bartels/Multivariate/Trivariate.R , the means and variances are tested together and I am not sure how to separate them.

Can I use my results from univariate saturated sub models or is there a way to test for it in a trivariate model?

Many thanks!

I assume you're talking about these lines:

Constrain only means

Constrain only variances

Yes, I did that as well, but I need to test the means and variances separately for each measure. Here we are testing the means for all measures together and then the variances, right?

Okay, I see now. Then just change the one mean or variance.

Replace

with

Thank you Michael.

The script that I am actually using is different: it does not contain the labFull function, and my variances are not fixed at 1.0 (see attachment). When I tried to apply that approach here:

eqMeTwinModel <- mxModel( Saturated_Model, name="eqM&Vtwins" )
eqMeTwinModel <- omxSetParameters( eqMeTwinModel, labels=c("MZM1_1","MZM1_2"),
                                   free=TRUE, values=svMe[1], newlabels=c("MZM1") )
eqMeTwinModel <- omxSetParameters( eqMeTwinModel, labels=c("DZM1_1","DZM1_2"),
                                   free=TRUE, values=svMe[1], newlabels=c("DZM1") )
eqMeTwinModel <- omxSetParameters( eqMeTwinModel, labels=c("MZF1_1","MZF1_2"),
                                   free=TRUE, values=svMe[1], newlabels=c("MZF1") )
eqMeTwinModel <- omxSetParameters( eqMeTwinModel, labels=c("DZF1_1","DZF1_2"),
                                   free=TRUE, values=svMe[1], newlabels=c("DZF1") )

I get an error:

Expected covariance matrix is not positive-definite

Is that a problem with my labels or am I doing something wrong?

Many thanks!

Izza

I think there is a problem with the specification which leads to difficulty during optimization. The way you have set up the model uses symmetric matrices. Although convenient, nothing prevents such a matrix from becoming non-positive definite (which can happen when an implied correlation exceeds 1, and which can be detected with an eigenvalue decomposition):
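For illustration, a check of this kind can be done in base R with eigen(). The matrix below is a made-up 2x2 "covariance" matrix, not one from the attached script; its implied correlation exceeds 1, so one eigenvalue is negative:

```r
# Made-up covariance matrix whose implied correlation is 1.2 (> 1),
# so the matrix cannot be positive definite
sigma <- matrix(c(1.0, 1.2,
                  1.2, 1.0), nrow = 2)

ev <- eigen(sigma, symmetric = TRUE)$values
ev              # 2.2 and -0.2: one eigenvalue is negative
all(ev > 0)     # FALSE: sigma is not positive definite
```

Running the estimated covariance matrix from your model through the same check would reveal whether the optimizer has wandered into such a region.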

So, although all the eigenvalues are positive here, they may not remain so during optimization. You may be able to spot the problem by running mxRun(model, unsafe=TRUE), which returns estimates even when an error occurs; those values can then be fed into eigen() to identify any zero or negative eigenvalues. While possibly useful, this information will not by itself correct the problem. One option is to start with diagonal matrices for the predicted covariance matrices; these are strongly positive definite and may help steer the optimizer clear of infeasible regions. The alternative approach, which is guaranteed to avoid this error, is to use a Cholesky decomposition with the diagonal elements bounded to be greater than zero. To do this, the matrices should be of type Lower, and an algebra should compute, e.g., L %*% t(L) to give the expected covariance matrix. Doing so does, however, make it harder to, e.g., equate twin 1 and twin 2 covariances.
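The principle behind the Cholesky approach can be sketched in base R with arbitrary numbers (in OpenMx the lower-triangular matrix would be an mxMatrix of type "Lower" with an lbound on its diagonal elements, and the product would be computed in an mxAlgebra):

```r
# Arbitrary lower-triangular matrix with strictly positive diagonal
L <- matrix(c(1.0, 0.0, 0.0,
              0.5, 0.8, 0.0,
              0.3, 0.2, 0.9), nrow = 3, byrow = TRUE)

# L %*% t(L) is positive definite by construction, so the optimizer
# can never propose an infeasible covariance matrix
sigma <- L %*% t(L)

all(eigen(sigma, symmetric = TRUE)$values > 0)  # TRUE for any such L
```

Because positive definiteness holds for any L with a positive diagonal, the "not positive-definite" error cannot arise under this parameterization.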

Thank you for your reply Michael!

I have tried the option with diagonal matrices for the predicted covariance matrices and the error is gone. However, the chi-square values for the submodels are not what I would expect based on my data and the univariate analyses. For equal means they are very large, and for the variances they are small; I expected the means to be equal, not the variances.

I've attached new script and the data file.

Izza

I think the analyses are correct, and that the means really are that different. Possibly, however, you did not intend the equalities you have imposed. With the equated model the expected means look like this:

With the not equated model they look like this:

The parameter specifications have equated the first two means, which were substantially different in the not-equated model (e.g., 540.03 vs. 16.25).
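For completeness, the chi-square values discussed above come from likelihood-ratio tests: the difference in -2 log-likelihood between the equated and not-equated models is referred to a chi-square distribution with df equal to the number of parameters constrained. A base-R sketch with made-up fit statistics (not the values from the attached output):

```r
# Made-up -2 log-likelihoods for illustration only
m2ll_full    <- 4000.0   # not-equated (saturated) model
m2ll_equated <- 4025.0   # model with means equated
df_diff      <- 3        # number of parameters constrained

chi2 <- m2ll_equated - m2ll_full
p    <- pchisq(chi2, df = df_diff, lower.tail = FALSE)
p    # very small here, so equating the means significantly worsens fit
```

With means as different as 540.03 vs. 16.25, a very large chi-square for the equal-means submodel is exactly what this test should produce.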