
Another CI problem w/ ordinal ACE model

chancey:
Another CI problem w/ ordinal ACE model

Hi all,

It appears my 95% CI problems for a series of ordinal ACE models I'm running are not solved after all. Specifically, while the CIs for my AE and CE submodels appear to be fine, the ACE CIs return errors (marked with !!! below).

confidence intervals:
lbound estimate ubound note
oneACEo.VC[1,1] NA 0.59633060 NA !!!
oneACEo.VC[1,2] 1.000507e-08 0.02077049 NA !!!
oneACEo.VC[1,3] NA 0.38289892 0.5185679 !!!

CI details:
diagnostic statusCode
1 alpha level not reached infeasible start
2 active box constraint infeasible start
3 success infeasible start
4 active box constraint infeasible start
5 active box constraint infeasible start
6 success infeasible start

I should mention I am using this model, which I think is appropriate given these data. I am very new to the OpenMx environment and am struggling with these models. Any advice would be very much appreciated.

AdminRobK:
eliminate lower bounds on ACE paths

The first thing I think you should try is to get rid of the lower bounds on the ACE path coefficients. You can do that either by simply not using argument lbound to mxMatrix() when you create the objects with R symbols 'pathA', 'pathC', and 'pathE', or by appropriately using omxSetParameters() on your assembled MxModel object.
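For concreteness, clearing the lower bounds on an already-assembled model might look like the sketch below. The model-object name (`twinACEModel`) and the parameter labels (`a11`, `c11`, `e11`) are assumptions for illustration; substitute whatever names and labels your own script uses.

```r
library(OpenMx)

# Hypothetical sketch: clear the lower bounds on the ACE path coefficients
# of an assembled MxModel. Object name and labels are assumptions.
twinACEModel <- omxSetParameters(
  twinACEModel,
  labels = c("a11", "c11", "e11"),
  lbound = NA  # NA removes the lower bound on each listed free parameter
)
twinACEFit <- mxRun(twinACEModel, intervals = TRUE)
```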

You could also try using a different optimizer. In your case, I'd suggest SLSQP. To switch to SLSQP, run this before you run mxRun():

mxOption(NULL,"Default optimizer","SLSQP")

If you're using OpenMx v2.7.11 or newer, you could get bootstrap confidence intervals instead. See the help page for mxBootstrap(). The default coverage probability of bootstrap CIs is 50%. If you increase it to 95% (via argument boot.quantile to summary()), definitely increase the number of replications to at least 1000.
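Putting those two pieces together, a bootstrap-CI run might look like the following sketch. The fitted-model name `twinACEFit` is an assumption for illustration.

```r
library(OpenMx)

# Hypothetical sketch: 'twinACEFit' is an already-run MxModel.
# mxBootstrap() resamples the data and refits the model 'replications' times.
bootFit <- mxBootstrap(twinACEFit, replications = 1000)

# Request 95% bootstrap CIs; the default coverage is only 50%.
summary(bootFit, boot.quantile = c(0.025, 0.975))
```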

chancey:
Thanks much for your reply,

Thanks much for your reply, Rob. I tried each of your suggestions. First, I attempted eliminating lbounds on path coefficients and/or using the optimizer you suggested. This fills in some gaps in CIs for the ACE model:

confidence intervals:

                lbound       estimate   ubound    note
oneACEo.VC[1,1] 4.762400e-31 0.59633234 0.7230270
oneACEo.VC[1,2]           NA 0.02076865 0.5926181 !!!
oneACEo.VC[1,3] 2.769719e-01 0.38289902 0.5170699

To investigate missing CIs, run summary() again, with verbose=T, to see CI details.

The bootstrapped CIs seem off, and when I use the mxBootstrap() command I get the following warning:
Warning message: Only 5% of the bootstrap replications converged. Accuracy is much less than the 1000 replications requested

AdminRobK:
a few more suggestions
First, I attempted eliminating lbounds on path coefficients and/or using the optimizer you suggested. This fills in some gaps in CIs for the ACE model:

I'm curious to see the details, from summary() with argument verbose=T. I'm guessing that the lower limit for the shared-environmental component is also approximately zero, as it is for the additive-genetic component.

How large is your sample, anyhow? And how often does the least-common level of the ordinal variable occur in the sample?

The bootstrapped CIs seem off and when I use mxBootstrap() command I get the following error:
Warning message:Only 5% of the bootstrap replications converged. Accuracy is much less than the 1000 replications requested

Hmm. Since you're analyzing an ordinal variable, you might need to tell OpenMx that status Red is OK for bootstrap replications, with

mxOption(NULL,"Status OK",c("OK","OK/green","not convex/red","nonzero gradient/red"))

BTW, if you decide to go with bootstrap CIs instead of profile-likelihood CIs, CSOLNP or NPSOL might be a better choice of optimizer than SLSQP. If you want, it's possible to use one optimizer to find point estimates and a different one to find profile-likelihood CIs, as in the syntax in one of my previous posts.
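A minimal sketch of that two-optimizer approach, switching the "Default optimizer" option between runs, might look like this. The object names are assumptions; the exact syntax in the earlier post referred to above may differ.

```r
library(OpenMx)

# Hypothetical sketch: find point estimates with CSOLNP, then switch to
# SLSQP for the profile-likelihood confidence intervals.
mxOption(NULL, "Default optimizer", "CSOLNP")
fit1 <- mxRun(twinACEModel)            # point estimates only

mxOption(NULL, "Default optimizer", "SLSQP")
fit2 <- mxRun(fit1, intervals = TRUE)  # CIs, starting from fit1's estimates
summary(fit2)
```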

chancey:
Tiny tiny sample,

Tiny tiny sample,
n=334/2=167 pairs. 29 DZ and 138 MZ pairs. Original data collection effort.

Here is verbose output (forgive the lack of formatting):

CI details:
c11 e11 method diagnostic statusCode
1 7.450660e-01 0.6669908 neale-miller-1997 success OK
2 -2.959168e-10 0.5262822 neale-miller-1997 success OK
3 -1.084544e-15 0.6253159 neale-miller-1997 alpha level not reached infeasible non-linear constraint
4 7.698171e-01 0.6382561 neale-miller-1997 success infeasible non-linear constraint
5 2.761231e-05 0.5262812 neale-miller-1997 success infeasible non-linear constraint
6 2.189199e-01 0.7190757 neale-miller-1997 success infeasible non-linear constraint

AdminRobK:
Lower limit for C is zero

OK, then yes, I guess you can just report the lower limit for the C variance component as zero. The point estimate was near-zero anyhow, and since the variance component is the square of a path coefficient, it can't go below zero.

Was this with SLSQP, BTW?

chancey:
Yes, with SLSQP.

Thanks again, Rob--I appreciate it.
