Constraint Not Honored

Posted on
rabil Joined: 01/14/2010
I have a simple two-factor measurement error model with a constraint that the variance of one of the latent factors be positive. mxTryHard easily finds a solution, but the variance estimate is always negative. What is the point of using a constraint if it is not honored? The negative estimate then causes problems when I try to bootstrap confidence intervals: the bootstrap complains about missing values and NaNs in quantile() and won't produce a result unless I remove the variance from the confidence interval list.
Replied on Thu, 12/01/2022 - 10:08
AdminRobK Joined: 01/24/2014

Could you share your syntax? Also, which optimizer are you using?
Replied on Tue, 12/06/2022 - 11:14
rabil Joined: 01/14/2010

In reply to by AdminRobK

I'm using SLSQP. I have no way to transfer the code - the computer system has no connection to the Internet for security reasons. There is nothing secret about the model itself, though.

There are two latent variables, u1 and u2, with an arrow pointing from u1 to u2; that path is estimated. u1 has just one indicator, with the loading fixed at 1 and the indicator's residual variance fixed at zero. u2 has 5 indicators, all with loadings fixed at 1, and the indicator residual variances are all constrained to be equal. There is a constraint that the variance of u2 > 0, yet for some data sets it is estimated to be slightly negative. This causes problems when bootstrapping to try to get a confidence interval, and even for some data sets where the variance is estimated to be positive there can still be problems using the bootstrap. Sorry I cannot show the code.
Replied on Tue, 12/06/2022 - 12:31
mhunter Joined: 07/31/2009

Just to reinforce Rob's point, OpenMx treats constraints (i.e., things that use `mxConstraint()`) very differently from bounds (i.e., the `lbound=` and `ubound=` arguments inside an `mxPath()` or `mxMatrix()`). Constraints are *much* harder optimization problems than bounds. I recommend using bounds instead of constraints whenever possible.
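
For instance, a minimal sketch of bounding a latent variance directly on its two-headed path (not your actual model; the label, start value, and bound here are placeholders):

# Hypothetical path: keep the variance of u2 at or above a small positive value
# by bounding the parameter itself rather than adding an mxConstraint().
mxPath(from='u2', arrows=2, values=1, labels='varU2', lbound=1e-4)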

Similarly, many constraints are nonlinear equality constraints. In this case, it is often much easier for optimization if you reformulate these nonlinear equality constraints with `mxAlgebra()`s and set the elements in a matrix or on a path to the result of these algebras.


mod <- mxModel(
    ...,
    mxAlgebra(sin(a)*exp(b) + c, name='peter'),
    mxPath(from='Paul', to='Pay', labels='peter[1,1]') # set the path from Paul to Pay to the [1,1] element of peter
)

Replied on Tue, 12/06/2022 - 14:06
rabil Joined: 01/14/2010

Thanks! I had completely forgotten about the existence of lower and upper bounds. I will use them.
Replied on Tue, 12/06/2022 - 14:24
rabil Joined: 01/14/2010

I just tried adding a lower bound and removed the mxConstraint. It doesn't appear to converge after 100 tries with mxTryHard. In my experience, if it doesn't find a solution in 100 tries, it won't find one in 1000 either.
Replied on Tue, 12/06/2022 - 14:34
rabil Joined: 01/14/2010

If I remove the constraint and don't include an lbound, the variance estimate is more negative than if I include the constraint. So the constraint does have an effect.
Replied on Tue, 12/06/2022 - 16:30
AdminRobK Joined: 01/24/2014

Two general suggestions...

First, try reducing the mxOption "Feasibility tolerance" to something smaller than its on-load value. For details, see the help page for `mxOption()`.

Second, try switching to CSOLNP. CSOLNP is an "interior-point" algorithm; that is, if it is initialized at a point inside the feasible space, then it will never reach the boundary of the feasible space.
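
For instance, a minimal sketch of both suggestions (the model object name and tolerance value are placeholders, not recommended settings):

# Hypothetical model object; substitute your own.
# Tighten the feasibility tolerance below its on-load value.
yourModel <- mxOption(yourModel, 'Feasibility tolerance', 1e-6)

# Switch the default optimizer to CSOLNP for the whole session.
mxOption(NULL, 'Default optimizer', 'CSOLNP')

fit <- mxTryHard(yourModel)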

Replied on Fri, 12/09/2022 - 12:49
mhunter Joined: 07/31/2009

The OpenMx dev team discussed this at our meeting today. Here's a bit of a summary:

1. It is odd that the solution is not honoring the `mxConstraint()`s.

2. There's no mathematical reason why this model should not work.

3. Could the problem be the data? If some of the variables are too strongly correlated, that could produce this problem. You can inspect the variance inflation factors (VIFs), which are the diagonal elements of the inverse of the correlation matrix of your data; a small sketch of this check appears at the end of this post.

4. See below for an example of what we think your model is. Is this right?


require(OpenMx)

lvar <- c('u1', 'u2')
mvar <- paste0('x', 1:6)

mod <- mxModel('rabil',
    type='RAM',
    manifestVars=mvar,
    latentVars=lvar,
    mxPath('u1', 'x1', values=1, free=FALSE),             # single indicator of u1, loading fixed at 1 (residual variance of x1 fixed at zero by omission)
    mxPath('one', 'u1', values=-1, labels='meaU1'),       # mean of u1
    mxPath('u1', arrows=2, values=.4, labels='varU1'),    # variance of u1
    mxPath('u2', arrows=2, values=1.5, labels='varU2'),   # variance of u2
    mxPath('u2', mvar[-1], values=1, free=FALSE),         # loadings of x2-x6 on u2, fixed at 1
    mxPath(mvar[-1], arrows=2, values=.2, labels='varE'), # residual variances of x2-x6, constrained equal via a shared label
    mxPath('one', 'u2', values=5, labels='meaU2'),        # mean of u2
    mxPath('u1', 'u2', values=.8, labels='reg')           # regression of u2 on u1
)

set.seed(11)
ds <- mxGenerateData(mod, nrows=2000)

modd <- mxModel(mod, mxData(ds, 'raw'))

moddr <- mxRun(modd)
summary(moddr)
Summary of rabil

free parameters:
name matrix row col Estimate Std.Error A
1 reg A u2 u1 0.7573228 0.044966956
2 varE S x2 x2 0.1972722 0.003119146
3 varU1 S u1 u1 0.3746096 0.011846192
4 varU2 S u2 u2 1.4755404 0.047912693
5 meaU1 M 1 u1 -0.9973239 0.013685980
6 meaU2 M 1 u2 4.9878243 0.052618578

Model Statistics:
| Parameters | Degrees of Freedom | Fit (-2lnL units)
Model: 6 11994 23155.12
Saturated: 27 11973 NA
Independence: 12 11988 NA
Number of observations/statistics: 2000/12000

Information Criteria:
| df Penalty | Parameters Penalty | Sample-Size Adjusted
AIC: -832.8845 23167.12 23167.16
BIC: -68010.1086 23200.72 23181.66
To get additional fit indices, see help(mxRefModels)
timestamp: 2022-12-09 12:35:55
Wall clock time: 0.149719 secs
optimizer: SLSQP
OpenMx version number: 2.20.7.39
Need help? See help(mxSummary)
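
Regarding point 3 above, a minimal sketch of the VIF check, assuming the simulated data frame `ds` from the example (swap in your own data and variable names):

# Diagonal of the inverse correlation matrix gives the variance inflation factors.
# Values much larger than about 10 would suggest near-collinear indicators.
R <- cor(ds[, paste0('x', 1:6)], use='pairwise.complete.obs')
vifs <- diag(solve(R))
round(vifs, 2)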