
OpenMx status1: 5 -- Problematic in this case?

raklein (Joined: 09/07/2019 - 12:26)
Attachments:
combinedresults1.csv (4.93 KB)
combinedresults2.csv (4.93 KB)
combinedresults3.csv (4.83 KB)
example.R (616 bytes)

Hi all,

I'm attaching some data and code to demonstrate my problem. Long story short, I'm wondering whether there is a fix for the 'OpenMx status1: 5' warning I'm getting, and/or whether it is OK to report these values as-is (perhaps with a grain of salt). We're hoping to submit this paper for publication shortly.

We ran a study in which 21 labs attempted to replicate the same social psychology study. Half of those labs ran a standardized version of the protocol, while the remaining labs were left to their own devices. Sometimes multiple labs from the same institution participated, so I want to control for that 'nesting' with metaSEM.
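Roughly, the three-level model I have in mind looks like the sketch below. This is just a minimal metaSEM illustration; the column names yi (effect size), vi (sampling variance), and Institution (cluster) are placeholders, not necessarily the names used in the attached CSVs.

library(metaSEM)

## Read one of the attached result files (placeholder column names assumed).
dat <- read.csv("combinedresults1.csv")

## Three-level random-effects model: effect sizes (level 2) nested within institutions (level 3).
fit3 <- meta3(y = yi, v = vi, cluster = Institution, data = dat)
summary(fit3)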

Thanks for any guidance,
Rick

AdminNeale (Joined: 03/01/2013 - 14:09)
METASEM?

This seems like a metaSEM question. But overall, no, code 5s cannot be ignored. They can occur when the starting values are light years away from the solution. They can also occur when the data are degenerate (e.g., more variables than people, no variance within a measure, or perfect covariance between measures).
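If poor starting values turn out to be the issue, one thing worth trying (a sketch, assuming the three-level fit object from the earlier post is called fit3) is metaSEM's rerun(), which re-optimizes the underlying OpenMx model from perturbed starting values via mxTryHard():

## Refit the (hypothetical) fit3 object with perturbed starting values.
fit3b <- rerun(fit3)
summary(fit3b)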

Mike Cheung (Joined: 10/08/2009 - 22:37)

The problem with these data sets is that there are only two effect sizes per cluster, so the estimated level-2 variances are not stable. I suggest using the single-level analysis, but mentioning that you have inspected the three-level model.
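For reference, the corresponding single-level random-effects model would be something like the following (again using the placeholder column names yi and vi from the earlier sketch, not necessarily those in the attached files):

library(metaSEM)
dat <- read.csv("combinedresults1.csv")

## Single-level random-effects meta-analysis, ignoring the institution clustering.
fit1 <- meta(y = yi, v = vi, data = dat)
summary(fit1)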

raklein (Joined: 09/07/2019 - 12:26)

Great, thanks for the responses. I'll do that and also post the current results as a supplement.