I have a simple two-factor measurement error model. There is a constraint that the variance of one of the latent factors be positive. mxTryHard easily finds a solution, but the variance estimate is always negative. What is the point of using a constraint if it is not honored? The negative estimate then causes problems when I try to bootstrap confidence intervals: it complains about missing values and NaNs in quantile, and won't produce a result unless I remove the variance from the confidence interval list.

Could you share your syntax? Also, which optimizer are you using?

I'm using SLSQP. I have no way to transfer the code: the computer system has no connection to the Internet for security reasons. There is nothing secret about the model itself. There are two latent variables, u1 and u2, with an arrow pointing from u1 to u2; that path is estimated. u1 has just one indicator, with the loading fixed to 1 and the indicator's residual variance fixed to zero. u2 has 5 indicators; all the loadings are fixed to 1 and the indicator residual variances are all constrained to be equal. There is a constraint that the variance of u2 > 0, yet for some data sets it is estimated to be slightly negative. This causes problems when bootstrapping to try to get a confidence interval. And even for some datasets where the variance is estimated to be positive, there can still be problems using the bootstrap. Sorry I cannot show the code.

Are you using an MxConstraint to enforce a positive variance for u2? If the variance of u2 is a free parameter, you can simply apply an `lbound` to it.

Just to reinforce Rob's point, OpenMx treats constraints (i.e., things that use `mxConstraint()`) very differently from bounds (i.e., things that use `lbound=` and `ubound=` inside an `mxPath()` or `mxMatrix()`). Constraints are much harder optimization problems than bounds. I recommend using bounds instead of constraints whenever possible. Similarly, many constraints are nonlinear equality constraints. In that case, it is often much easier for optimization if you reformulate these nonlinear equality constraints with `mxAlgebra()`s and set the elements in a matrix, or the values on a path, to the result of those algebras.

Thanks! I had completely forgotten about the existence of lower and upper bounds. I will use them.
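To illustrate the bound approach, here is a minimal sketch of a variance path with an `lbound` instead of an `mxConstraint()`. The latent name "u2" follows the thread; the label and start value are illustrative assumptions, not the poster's actual code.

```r
library(OpenMx)

# Sketch: bound the latent variance from below rather than using mxConstraint().
# A small positive lbound (instead of exactly 0) can keep the estimate
# off the boundary if that helps the optimizer.
u2VarPath <- mxPath(from = "u2", arrows = 2, free = TRUE,
                    values = 1, lbound = 0, labels = "varU2")
```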

Yes, to be clear I was using an mxConstraint.

I just tried adding a lower bound. I removed the mxConstraint. It doesn't appear to converge after 100 tries in mxTryHard. My experience is that if it doesn't find a solution with 100 tries it won't even with 1000 tries.

If I remove the constraint and don't include an lbound, the variance estimate is more negative than if I include the constraint. So the constraint does have an effect.

Two general suggestions...

First, try reducing the mxOption "Feasibility tolerance" to something smaller than its on-load value. For details, see the help page for `mxOption()`.

Second, try switching to CSOLNP. CSOLNP is an "interior-point" algorithm; that is, if it is initialized at a point inside the feasible space, then it will never reach the boundary of the feasible space.
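In R, these two suggestions look roughly like the following; the tolerance value shown is only an example, not a recommendation (see `?mxOption` for the actual default):

```r
library(OpenMx)

# Tighten the feasibility tolerance on an existing model object "model"
# (1e-5 is an illustrative value, not the documented default).
model <- mxOption(model, "Feasibility tolerance", 1e-5)

# Or switch the optimizer globally before calling mxRun()/mxTryHard():
mxOption(NULL, "Default optimizer", "CSOLNP")
```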

The OpenMx dev team discussed this at our meeting today. Here's a bit of a summary:

It is odd that the solution is not honoring the `mxConstraint()`s. There's no mathematical reason why this model should not work.

Could the problem be the data? If some of the variables are too strongly correlated, it could produce this problem. You can inspect the variance inflation factors (VIFs): they are the diagonal elements of the inverse of the correlation matrix of your data.
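A quick way to compute the VIFs in R, assuming the observed variables are in a data frame (here called `dat`, a placeholder name):

```r
# VIFs are the diagonal of the inverse of the correlation matrix.
R <- cor(dat, use = "pairwise.complete.obs")
vifs <- diag(solve(R))
round(vifs, 2)  # rule of thumb: values much above ~10 flag strong collinearity
```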

See below for an example of what we think your model is. Is this right?
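The example itself did not come through in this copy of the thread, so here is a hedged reconstruction in RAM notation based on the description earlier in the thread. The variable names, start values, and the data object `dat` are all placeholders.

```r
library(OpenMx)

manifests <- c("x1", paste0("y", 1:5))
latents   <- c("u1", "u2")

guess <- mxModel(
  "twoFactorGuess", type = "RAM",
  manifestVars = manifests, latentVars = latents,
  # u1: single indicator, loading fixed to 1, residual variance fixed to 0
  mxPath(from = "u1", to = "x1", free = FALSE, values = 1),
  mxPath(from = "x1", arrows = 2, free = FALSE, values = 0),
  # u2: five indicators, loadings fixed to 1, equal residual variances
  # enforced by a shared label
  mxPath(from = "u2", to = paste0("y", 1:5), free = FALSE, values = 1),
  mxPath(from = paste0("y", 1:5), arrows = 2, free = TRUE,
         values = 1, labels = "residVar"),
  # regression u1 -> u2, estimated
  mxPath(from = "u1", to = "u2", free = TRUE, values = 0.5),
  # latent variances; an lbound on u2's variance replaces the mxConstraint
  mxPath(from = "u1", arrows = 2, free = TRUE, values = 1),
  mxPath(from = "u2", arrows = 2, free = TRUE, values = 1,
         lbound = 0, labels = "varU2"),
  # manifest means (needed for raw data)
  mxPath(from = "one", to = manifests, free = TRUE, values = 0),
  mxData(observed = dat, type = "raw")
)
```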