Attachment | Size
---|---
Lavaan twin modelling[1].R | 1.51 KB
Hi Mike,
Hope this finds you well. I'm trying to run a multi-group SEM (twin data) using summary statistics rather than raw data. I have the covariance and asymptotic covariance matrices. I think the errors come from the way I'm handling the multi-group aspect; with the WLS estimator I get the following:
Error in withCallingHandlers(expr, warning = function(w) invokeRestart("muffleWarning")) :
(list) object cannot be coerced to type 'double'
I've supplied the two matrices as lists, one element per group. The script is attached.
I also get this message:
In addition: Warning message:
In if (is.pd(Cov)) { :
the condition has length > 1 and only the first element will be used
Any help much appreciated as always!
Best,
Pasco
Dear Pasco,
Attached is the code for your model.
You may note that the SEs in lavaan differ from those in OpenMx, because the precision of the inputs differs between the two cases.
Best,
Mike
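For readers without the attachment, a minimal sketch of the kind of multi-group lavaan fit being discussed, from covariance matrices alone, might look like the following. The matrices, labels, and sample sizes here are illustrative placeholders, not values from the attached script; the key point is that for multiple groups `sample.cov` must be a list of matrices and `sample.nobs` a vector, one entry per group.

```r
library(lavaan)

# Toy 2x2 twin covariance matrices (phenotype of twin 1 and twin 2);
# the actual matrices would come from the summary statistics.
mzCov <- matrix(c(1.00, 0.70,
                  0.70, 1.00), 2, 2,
                dimnames = list(c("P1", "P2"), c("P1", "P2")))
dzCov <- matrix(c(1.00, 0.40,
                  0.40, 1.00), 2, 2,
                dimnames = list(c("P1", "P2"), c("P1", "P2")))

# Toy model: a group-specific twin covariance, labelled per group.
model <- ' P1 ~~ c(cMZ, cDZ)*P2 '

# sample.cov: list of matrices (one per group); sample.nobs: vector.
fit <- sem(model,
           sample.cov  = list(MZ = mzCov, DZ = dzCov),
           sample.nobs = c(500, 500))
summary(fit)
```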
Thanks so much Mike - that's fantastic. I was quite far off in my attempt here!
All best,
Pasco
Hi Mike,
I have a follow-up query about this. The model runs perfectly with your edits and returns sensible-looking parameter estimates and standard errors. However, when I add a request for confidence intervals, I get very strange intervals. I'm not sure whether my code is wrong or there is a more fundamental issue with the likelihood-based intervals. The adjusted part of the code is just:
ace <- mxModel('All', gp1, gp2,
               mxFitFunctionMultigroup(c('gp1.fitfunction', 'gp2.fitfunction')),
               mxCI('gp1.Amatrix', interval = 0.95, type = 'both'))
fit1 <- mxRun(ace, intervals = TRUE)
Thanks for any help, as always!
Best,
Pasco
Dear Pasco,
The parameter estimates with the LBCI are the same as those without the LBCI.
The model without the LBCI:
The model with the LBCI:
Could you tell us what the issue is?
Best,
Mike
Hi Mike,
Yes, the estimates are fine; it's the CIs that are strange - they run almost from -1 to 1 (for a).
Best,
Pasco
I bet some of your free parameters are sign-indeterminate.
Ahhh, of course - thanks, that's likely it!
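To spell out Mike's diagnosis: in an ACE model the likelihood depends on the path a only through a², so a and -a fit the data equally well, and a profile-likelihood CI on a can span roughly (-|a|, +|a|). One common remedy, sketched below under assumed names (the label `a11` and starting value are placeholders, not from Pasco's script), is to bound the path at zero so the mirror-image solution is excluded; another is to request the CI on the squared quantity, which is sign-invariant.

```r
library(OpenMx)

# Bounding the path at zero rules out the equivalent -a solution,
# so the likelihood-based CI no longer runs from -|a| to +|a|.
aPath <- mxMatrix("Full", nrow = 1, ncol = 1,
                  free = TRUE, values = 0.6,
                  lbound = 0,            # exclude the mirror solution
                  labels = "a11", name = "a")

# Alternative: request the CI on the squared, sign-invariant quantity,
# e.g. an algebra A = a %*% t(a), and then mxCI("A") instead of mxCI("a").
```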