CIs on mxAlgebra

Attachment | Size
---|---
CI.pdf | 144.31 KB
I have an mxAlgebra that multiplies a parameter by a constant, say new_x = 2*x. When I request likelihood-based confidence intervals (LBCIs) on both x and new_x, I expect the CIs on new_x to be the same as the CIs on x multiplied by 2. It turns out that they are not exactly the same (see the "diff" column in the output below). When the constant is larger, say 5 or 10, the CIs on new_x even become NA.
Any ideas why this happens? Thanks.
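For reference, here is a minimal sketch of the kind of setup described above; the model and variable names are my assumptions, not the original script.

```r
library(OpenMx)

set.seed(1)
dat <- data.frame(y = rnorm(500))

m <- mxModel("Two",
  mxData(dat, type = "raw"),
  # Free variance parameter x, held in a 1x1 covariance matrix.
  mxMatrix("Full", nrow = 1, ncol = 1, free = TRUE, values = 1,
           labels = "x", name = "S"),
  # Free mean (required with raw data).
  mxMatrix("Full", nrow = 1, ncol = 1, free = TRUE, values = 0,
           labels = "mu", name = "M"),
  # The algebra of interest: new_x = 2*x.
  mxAlgebra(2 * S, name = "two_variance"),
  mxExpectationNormal(covariance = "S", means = "M", dimnames = "y"),
  mxFitFunctionML(),
  # Request likelihood-based CIs on both the parameter and the algebra.
  mxCI(c("x", "two_variance"))
)

fit <- mxRun(m, intervals = TRUE)
summary(fit)$CI  # compare 2 * CI(x) against CI(two_variance)
```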
```
## Multiplied by 2
##            variance Two.two_variance.1.1. variance_x2          diff
## lbound    0.7693044              1.537972    1.538609 -0.0006373491
## estimate  1.0030992              2.006198    2.006198  0.0000000000
## ubound    1.3427832              2.688370    2.685566  0.0028032732
##
## Multiplied by 5
##            variance Five.five_variance.1.1. variance_x5         diff
## lbound    0.7693044                3.840211    3.846522 -0.006311445
## estimate  1.0030992                5.015496    5.015496  0.000000000
## ubound    1.3427832                      NA    6.713916           NA
##
## Multiplied by 10
##            variance Ten.ten_variance.1.1. variance_x10 diff
## lbound    0.7693044                    NA     7.693044   NA
## estimate  1.0030992              10.03099    10.030992    0
## ubound    1.3427832                    NA    13.427832   NA
```
Best,
Mike
Hi, Mike. I can reproduce what you report.
However, by switching to SLSQP as the optimizer at the beginning of the script, I get no confidence limits reported as `NA`, and much smaller differences between the calculated and estimated confidence limits. OpenMx's default behavior is to use an inequality-constrained representation of the confidence-limit optimization problem with SLSQP, but a quadratic-penalty representation with NPSOL and CSOLNP. My guess is that multiplying the variance as your script does makes the quadratic-penalty representation ill-conditioned.
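For reference, the standard way to make that switch (this is the documented OpenMx option, not code from the attached script):

```r
library(OpenMx)
# Make SLSQP the default optimizer for models run after this call.
mxOption(NULL, "Default optimizer", "SLSQP")
```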
Hi, Robert.
Thanks for the comments and suggestions.
I have compared the performance of the three optimizers. Attached is a real problem I encountered when using OpenMx to conduct a meta-analysis. I am interested in calculating the CI on Tau/(Tau+s2), where Tau is a free parameter and s2 is a constant (0.08486598 in this example).
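For concreteness, a hedged sketch of the algebra in question; the label `tau2` and names `Tau` and `I2` are my placeholders, not necessarily those in the attached script.

```r
library(OpenMx)

# Free between-study variance Tau. The known constant s2 = 0.08486598
# must appear literally inside the mxAlgebra expression, because names
# in an algebra are resolved in the model's namespace, not in R's.
Tau <- mxMatrix("Full", nrow = 1, ncol = 1, free = TRUE, values = 0.1,
                labels = "tau2", name = "Tau")
I2  <- mxAlgebra(Tau / (Tau + 0.08486598), name = "I2")

# Add Tau and I2 to the meta-analytic model, request
# mxCI(c("tau2", "I2")), and run with mxRun(model, intervals = TRUE).
```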
SLSQP does not work well for either the `lbound` or the `ubound`.
```
                lbound  estimate    ubound
Tau2_correct 0.2911837 0.6078023 0.8110563
Tau2_mxCI    0.2762502 0.6078023 0.6748704
```
CSOLNP does okay on the `ubound` but not the `lbound`.
```
                lbound  estimate    ubound
Tau2_correct 0.2908848 0.6078023 0.8115252
Tau2_mxCI    0.2748456 0.6078023 0.8112579
```
NPSOL performs similarly to CSOLNP.
```
                lbound  estimate    ubound
Tau2_correct 0.2908847 0.6078023 0.8115252
Tau2_mxCI    0.2748456 0.6078023 0.8112579
```
Looking at the CIs on the mxAlgebra, it is hard to tell which, if any, are the correct CIs. Any suggestions? Thanks.
Best,
Mike
validation
See the attached script. SLSQP and NPSOL both validate their lower limits. CSOLNP has trouble running the model to validate its lower limit, but I think its lower limit is still trustworthy, since it's approximately the same as the other two optimizers' lower limits. However, SLSQP's upper limit does not validate.
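For illustration, one way to validate a profile-likelihood confidence limit by hand (a rough sketch under assumed names, not OpenMx's planned built-in check): fix the algebra at the candidate limit with an mxConstraint, refit, and confirm that the fit worsens by about qchisq(0.95, 1) relative to the unconstrained solution.

```r
# Assumes `fit` is the fitted MxModel and `I2` is the 1x1 algebra of
# interest; 0.8115252 stands in for the candidate upper limit.
chk <- mxModel(fit, mxConstraint(I2 == 0.8115252, name = "pinI2"))
chk <- mxRun(chk)

# If the candidate is a genuine 95% profile-likelihood limit, the
# constrained fit should be worse by roughly qchisq(0.95, 1) = 3.8415
# in -2 log-likelihood units.
chk$output$minimum - fit$output$minimum
```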
An OpenMx function to automatically attempt to validate confidence limits is a planned feature.
Thanks, Robert.
It's very helpful.
Mike
You're welcome!
can you post the output of verbose?
Can you post the output from `summary(..., verbose = TRUE)`? That will give us a better look at the diagnostics.
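```r
# Assuming the fitted model object is named `fit`:
summary(fit, verbose = TRUE)
```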
Hi Timothy,
Here it is. Thanks.
Mike
verbose output
Please see the attached output.
local minimum
I don't really see anything wrong here, in terms of bugs. Gradient-based optimizers cannot always find the global minimum.
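One standard mitigation, sketched here with an assumed model object `fit`, is to refit from perturbed starting values using OpenMx's mxTryHard():

```r
# Refit from several sets of jittered start values and keep the best
# solution found.
fit2 <- mxTryHard(fit, extraTries = 10)
```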
Seems to me that the user