
Very small standard errors from indirectEffect()

forscher
Attachments:
cor_mats.csv (4.6 KB)
indirectEffect metaSEM forum.R (935 bytes)

Hi metaSEM users, hi Mike,

I've been attempting to use indirectEffect() to estimate the direct and indirect effects from a series of 3 by 3 correlation matrices. I then want to meta-analyze the resulting direct and indirect effects using either meta() from the metaSEM package or mvmeta() from the mvmeta package in R.

However, I've been running into some (potential) problems doing this. First, meta() doesn't seem to converge on a stable solution when I try to obtain meta-analytic estimates of the direct and indirect effects returned by indirectEffect(). I've tried using rerun() to find good start values and refitting the model with those values, but so far that hasn't helped.
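In rough outline, the metaSEM side of what I'm doing looks like the sketch below (cor_mats and n are placeholder names, not the exact objects in the attached script):

library(metaSEM)
# cor_mats: a list of 3x3 correlation matrices, with the variables ordered
# as ?indirectEffect expects; n: a vector of the corresponding sample sizes
med_effects <- indirectEffect(cor_mats, n)
# Columns 1:2 hold the indirect and direct effects; columns 3:5 hold their
# sampling variances and covariance.
fit <- meta(y = med_effects[, 1:2], v = med_effects[, 3:5])
summary(fit)
# Refit from new start values in an attempt to escape a bad solution.
fit2 <- rerun(fit)
summary(fit2)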

I've also tried fitting the same model with mvmeta(), which, to my knowledge, does arrive at a stable solution (please correct me if I'm wrong). What's strange about the mvmeta() results is that the standard error of the meta-analytic indirect effect seems anomalously small to me (.0020), especially compared with the standard error of the direct effect (.0277). However, I'm not sure how to troubleshoot this problem, or even whether this result necessarily indicates a problem.
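For the mvmeta() fit, I'm doing something along these lines (again only a sketch; as far as I can tell, the S argument takes each study's sampling (co)variances as the vectorized lower triangle, which appears to match the column order that indirectEffect() returns):

library(mvmeta)
fit_mv <- mvmeta(med_effects[, 1:2], S = med_effects[, 3:5], method = "reml")
summary(fit_mv)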

Would anyone be able to give me some advice? I've attached sample data and R code that reproduces the problems for reference.

Mike Cheung

The results in mvmeta() are problematic because the correlation between the random effects is -1.
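For example, you can see this from the estimated between-study covariance matrix of your mvmeta fit (fit_mv here stands for whatever you called that object):

# Psi is the estimated between-study (co)variance matrix of the random effects;
# converting it to a correlation matrix shows the off-diagonal value of -1.
cov2cor(fit_mv$Psi)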

The issue seems to be related to the data. If you rescale the effect sizes and their sampling covariance matrices by 10 and 100, respectively, it will have a convergent solution. The results indicate that Tau2_1_1 hits the lower bound 1e-10.
meta(med_effects[, 1:2]*10, med_effects[, 3:5]*100)

If you constrain the heterogeneity of the indirect effect at 0, it works fine.
meta(med_effects[, 1:2], med_effects[, 3:5], RE.constraints = Diag(c(0, "0.01*Var_dir")))

forscher

Thanks for the reply!

With your suggested solution, I still get a very small standard error for the indirect effect (.0020). I'm also getting an OpenMx error code "6". Does this indicate that I should not trust this analysis? If so, how would you recommend that I troubleshoot further?

Mike Cheung

It works fine in metaSEM 0.9.4 with OpenMx 2.2.6. Could you post your input and output if it does not work?
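It would also help to include the version and optimizer information, for example with the standard calls below:

sessionInfo()
packageVersion("metaSEM")
packageVersion("OpenMx")
mxOption(NULL, "Default optimizer")  # which optimizer OpenMx is currently using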

forscher

Here's my output and session info. I realized that I had not updated R to version 3.2.1, so my version of metaSEM was not up to date. However, after updating both R and metaSEM, I am still getting the OpenMx error code 6 message, even after fixing the heterogeneity of the indirect effect to 0. That said, the estimate for the indirect effect and its standard error on my computer are almost identical to yours.

(By the way, the attached PDF uses the same input data that I attached in my first message.)

Mike Cheung

That's strange. It works fine on both 32-bit and 64-bit Windows 7 (see the attached files).

I hope others may shed light on this issue.

forscher

I got it to converge after rescaling by a larger amount: the effects by 1,000 and their sampling variances/covariances by 1,000^2.
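In code, that amounts to something like the sketch below (same med_effects object as in the earlier posts; the pooled effects and their SEs come back on the 1,000x scale, so they need to be divided by 1,000, and the Tau2 estimates by 1,000^2, to get back to the original metric):

fit_scaled <- meta(y = med_effects[, 1:2] * 1000,
                   v = med_effects[, 3:5] * 1000^2)
summary(fit_scaled)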

I still have a question about the standard error of the indirect effect. In every case where the model has converged, this standard error has been very small (~.0020). It seems strange to me to have a standard error that small, especially compared with the standard error of the direct effect from the same model. Is this behavior typical when meta-analyzing indirect effects, or should I be concerned that there are other unresolved issues with the model?

Mike Cheung

I don't think that this is an issue related to the indirect effect.

Since the heterogeneity variance of the indirect effect is practically 0, its pooled estimate is effectively based on a fixed-effects model. The SE is very small because you have a huge total sample size (n = 5,543).

The estimated heterogeneity variance of the direct effect is 0.0212. This value enters the calculation of the SE and makes it larger than it would be under a fixed-effects model.
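As a rough univariate illustration with the usual inverse-variance formulas (v_ind and v_dir below stand for the study-level sampling variances of the indirect and direct effects, i.e. columns 3 and 5 of med_effects; this ignores the covariance between the two effects):

v_ind <- med_effects[, 3]  # sampling variances of the indirect effects
v_dir <- med_effects[, 5]  # sampling variances of the direct effects
# Indirect effect: Tau2 is essentially 0, so the weights are fixed-effect weights.
sqrt(1 / sum(1 / v_ind))
# Direct effect: the heterogeneity estimate 0.0212 is added to each study's
# sampling variance, which reduces the weights and inflates the SE.
sqrt(1 / sum(1 / (v_dir + 0.0212)))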

forscher

I see. So as the heterogeneity of the indirect effect approaches 0, the indirect effect estimate and its standard error approach those based on a fixed-effects model. I expect the reason the estimate has so little heterogeneity is that many of the study-specific indirect effects are very close to 0.

This makes sense. Thank you for your help!

forscher

I get the same error when multiplying the direct and indirect effect estimates by 10 and the variances/covariances by 100. I get convergence if I multiply both the estimates and the variances/covariances by 100 (which I know is inappropriate, but still might be informative).

I'm using the CSOLNP optimizer that comes with OpenMx 2.2.6.
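One more thing I may try is switching the default optimizer before fitting; this is a standard OpenMx option, although whether NPSOL or SLSQP is available depends on the particular OpenMx build:

library(OpenMx)
library(metaSEM)
mxOption(NULL, "Default optimizer", "SLSQP")  # or "NPSOL" if the build includes it
fit <- meta(y = med_effects[, 1:2], v = med_effects[, 3:5],
            RE.constraints = Diag(c(0, "0.01*Var_dir")))
summary(fit)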