
Modification indexes/indices

mdewey
Modification indexes/indices
Attachments: mi.r (380 bytes), sorbom.r (2.16 KB), sorbom.txt (2.72 KB)

I am starting a new thread on this but see
https://openmx.ssri.psu.edu/thread/831
for background.

I am still attempting to replicate the results from Sorbom's paper using the code posted by Ryne and some of the new options to mxRun outlined in the old thread. I am attaching three files:
mi.r contains my code for modification indexes;
sorbom.r creates the correlation matrix from one of the examples in Sorbom's paper and runs it;
sorbom.txt is the resultant output showing the value of the MI I get, the values of the gradient, and the principal diagonal of the Hessian. It also has my session info.

Since I am not quite sure what combination of Major iterations and useOptimizer to use, I have run it for several choices. As you can see, I am getting zero values for the MI. A number of possibilities occur to me, in non-increasing order of my prior probability: (a) I have failed to implement Ryne's code properly; (b) I have made some other error in specification; (c) the changes made to the optimizer do not support this sort of calculation.

Even more worrying is that if you re-run sorbom.r several times in the same session you do not get the same results. I can understand that OpenMx often needs to start from random values, but that does not seem appropriate here, although I may have misunderstood.

And a final query about degrees of freedom: I get the same chi-square for the model without additional paths as Sorbom, but the degrees of freedom differ by 7, which is the size of the correlation matrix (OpenMx 6, Sorbom 13).

mspiegel
Some comments and hopefully

Some comments, and hopefully other developers will chime in. OpenMx does not start the optimization process from random values. It starts from the starting values specified for the free parameters, using the 'values=' arguments to mxMatrix or mxPath. Also, I see that fit0, fit1, and fit2 all have identical free parameter estimates, which is to be expected, as the optimizer is never invoked when useOptimizer=FALSE is specified in the call to mxRun(). I see that the gradient values are different for fit0, fit1, and fit2, and I don't know whether that is correct or incorrect.
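
For anyone new to this, here is a minimal sketch of where those starting values come from (my own illustration, not from the attached files, using the demoOneFactor example data that ships with OpenMx):

# A minimal sketch (not from the attached files) showing where OpenMx takes
# its starting values: the 'values=' arguments, not random starts.
library(OpenMx)

data(demoOneFactor)                      # example data shipped with OpenMx
manifests <- names(demoOneFactor)

model <- mxModel("OneFactor", type = "RAM",
                 manifestVars = manifests,
                 latentVars   = "G",
                 mxPath(from = "G", to = manifests, values = 0.8),  # loadings start at 0.8
                 mxPath(from = manifests, arrows = 2, values = 1),  # residuals start at 1
                 mxPath(from = "G", arrows = 2, free = FALSE, values = 1),
                 mxData(cov(demoOneFactor), type = "cov",
                        numObs = nrow(demoOneFactor)))

# With useOptimizer=FALSE the estimates stay at exactly these starting values,
# which is why fit0, fit1, and fit2 show identical free parameter estimates.
fit <- mxRun(model, useOptimizer = FALSE)
omxGetParameters(fit)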

tbates
Getting robust modification indices out of OpenMx

Modification indices have several uses, including detecting misspecification [1].

OpenMx doesn't do this. We had a helper adapted from John Fox's sem package, but it fails on many models (non-positive-definite errors), and it is also very slow.

lavaan appears to compute the MI and expected parameter change for free (they are just a slot in the fitted model)...

That means it is pretty easy to build a robust MI function (as in http://cran.r-project.org/web/packages/semTools/semTools.pdf).
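
For concreteness, a minimal sketch (my illustration, not from this thread) of what lavaan exposes, using its built-in Holzinger-Swineford example:

library(lavaan)

model <- '
  visual  =~ x1 + x2 + x3
  textual =~ x4 + x5 + x6
  speed   =~ x7 + x8 + x9
'
fit <- cfa(model, data = HolzingerSwineford1939)

# One row per currently fixed parameter, with the expected chi-square drop
# (mi) and the expected parameter change (epc).
head(modindices(fit, sort. = TRUE))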

Is there any way we can get the information out of OpenMx which Lavaan exposes?

This thread implies that OpenMx is not computing something necessary for this, but perhaps it is? Any help much appreciated.

[1] Saris, W. E., Satorra, A., & van der Veld, W. M. (2009). Testing structural equation models or detection of misspecifications? Structural Equation Modeling, 16, 561-582.

neale
Ok for RAM models

I can see this working for type=RAM or type=LISREL, for which the first and second derivatives are known algebraically. However, in general we don't have a matrix algebra derivative solver, so when users specify their models with mxAlgebra() calls we won't be able to get MIs. Well, not until we get software that does symbolic matrix calculus patched into OpenMx, which would be a delight to see.
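
For reference (my summary, not part of the original post): the RAM expected covariance is an explicit expression in the A, S, and F matrices, so its derivative with respect to any single element of A or S can be written in closed form, for example:

# Expected covariance for a RAM model: Sigma = F (I - A)^-1 S (I - A)^-T F'
ramExpectedCov <- function(A, S, F) {
  IA <- solve(diag(nrow(A)) - A)
  F %*% IA %*% S %*% t(IA) %*% t(F)
}

# Closed-form derivative of Sigma with respect to the symmetric element S[i, j]
dSigma_dS <- function(A, S, F, i, j) {
  IA <- solve(diag(nrow(A)) - A)
  E  <- matrix(0, nrow(S), ncol(S))
  E[i, j] <- 1
  E[j, i] <- 1
  F %*% IA %*% E %*% t(IA) %*% t(F)
}

# Nothing comparable exists for an arbitrary mxAlgebra() expectation, which is
# why symbolic matrix calculus would be needed in the general case.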

Ryne
Agreed. It goes beyond the

Agreed. It goes beyond the lack of algebraic gradients and Hessians for non-RAM/LISREL models. RAM and LISREL models have a finite set of possible parameters that could be added for an MI. Outside those constrained model types, there are infinitely many candidate parameters that one could add.

tbates
for RAM is a big win

Hi, just to say that robustly handling this for RAM would be a big win for many users: I'd say the RAM user base is about 10x the arbitrary-matrix user base...

MIs based on gradients also seem to discover better modifications than drop1()-type approaches do.

neale
Maybe

I'm not sure about the current user base, but of modelers generally that may be true. We have a lot of matrix users in twin research (100x the mxPath() ones, I'd say).

What would be a big win for me is being able to compute MIs subject to equality constraints between matrix elements. In an arbitrary-algebra context this is easy to arrange; in RAM it isn't. So in a behavior genetics context, there's not much point in looking at the MI for the path for additive genetic effects for twin 1 without that path being constrained to equal the corresponding path for twin 2 (and the same across multiple groups). Sorting out the mathematics for this situation would be step 1...
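
To make the constraint issue concrete, a minimal sketch (mine, purely illustrative) of how such an equality is usually expressed in OpenMx, via shared labels:

library(OpenMx)

# The additive genetic path gets the same label ("a11") for twin 1 and twin 2,
# so OpenMx treats the two elements as a single free parameter. An MI for this
# path would ideally respect that equality rather than freeing one twin's
# element on its own.
aTwin1 <- mxMatrix("Full", nrow = 1, ncol = 1, free = TRUE,
                   values = 0.6, labels = "a11", name = "aT1")
aTwin2 <- mxMatrix("Full", nrow = 1, ncol = 1, free = TRUE,
                   values = 0.6, labels = "a11", name = "aT2")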

mculbert
Gradient not calculated

Hello,

I think part of the problem is that the gradient is not calculated when useOptimizer=F.

The attached test fits the independence model for the Sorbom Table 4/Figure 1 example; it then reconstructs the model with starting parameters set to the estimated values from the previous fit. This second model is run with useOptimizer=F, yet the returned gradients are not the same.

If I'm reading npsolWrap.c correctly, this appears to be due to the fact that objectiveFunction() doesn't actually calculate the gradient, leaving the calculation (by numerical difference) to npsol (which, of course, is never called when useOptimizer=F). Can someone more familiar with the code confirm this?

Ideally, I imagine computation of the derivatives should be added to objectiveFunction(). In the meantime, is there an easy way to evaluate the objective for an arbitrary set of parameters from R? If so, perhaps the gradient could be approximated with R's numericDeriv().
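
A minimal sketch of that finite-difference idea (my own code, not a tested solution; it assumes an already-run MxModel and that model$output$minimum holds the fit value after mxRun with useOptimizer=FALSE):

library(OpenMx)

# Re-evaluate the fit function at given parameter values without letting the
# optimizer move them.
objectiveAt <- function(model, values) {
  m <- omxSetParameters(model, labels = names(values), values = values)
  mxRun(m, useOptimizer = FALSE, silent = TRUE)$output$minimum
}

# Central-difference gradient (the same idea as numericDeriv, written out).
numericGradient <- function(model, eps = 1e-4) {
  theta <- omxGetParameters(model)
  sapply(seq_along(theta), function(k) {
    up <- theta; up[k] <- up[k] + eps
    dn <- theta; dn[k] <- dn[k] - eps
    (objectiveAt(model, up) - objectiveAt(model, dn)) / (2 * eps)
  })
}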

Thanks,
Michael Culbertson

mculbert
New mi function

I did some more mucking around in npsolWrap.c and I noticed that the gradient is calculated in the same routine that provides calculatedHessian. I tried having OpenMx spit out the calculated gradient as well. Unfortunately, it seems that these values aren't all that accurate, at least when compared with the gradient and Hessian provided by SAS PROC CALIS. (And they yield the wrong MI for the Sorbom Table 5a example.)

On the up side, I was able to adapt the routine from John Fox's sem R package, which computes the RAM gradient/Hessian analytically. Fox's routine wasn't quite right (as is noted in the documentation), but I think I've corrected it so that it agrees with CALIS (and the Sorbom Table 5a example). The function takes a RAM-type MxModel (with cov/cor data) and returns a vector of modification indices and the corresponding estimated parameter changes.

The attached file illustrates the agreement with Sorbom Table 5a.
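
A hypothetical usage sketch (the function and field names below are placeholders of mine; the actual names are whatever the attached file defines):

fit <- mxRun(sorbomModel)      # a fitted RAM-type MxModel with cov/cor data
res <- ramModIndices(fit)      # placeholder name for the adapted Fox routine
res$mi       # modification index for each currently fixed parameter
res$change   # corresponding estimated parameter change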

Michael Culbertson

tbates
updates?

Any news on doing this in OpenMx?

I notice too that there are edge cases where the MI function returns an error using the Fox method.

mdewey
Current position

Just in case someone comes across this in the archive, I should summarise:

a - we have a working solution for modification indexes (thanks to Michael Culbertson);
b - the whole issue of gradient calculation still seems to be under discussion, as the thread
http://openmx.psyc.virginia.edu/thread/1035
suggests.

Thanks to everyone for their efforts on this. And, as a PS, can anyone tell me why OpenMx and Sorbom differ in the reported degrees of freedom? See the last paragraph of my original post.

JWiley
Updates?

Just curious: are there plans from the development team to include modification indices in OpenMx anytime soon (say, a 3-9 month window)? I am working on some models where I want to make a lot of use of modification indices, and while what is here is nice, I want something a little different. If there are no plans to implement this in core OpenMx soon, I will go ahead and write some functions to extend what is here, but I am not going to spend the time making a hack if the real thing is about to come :)

Ryne
There are no immediate plans

There are no immediate plans for modification indices in the main OpenMx release. You can view our weekly developer's meeting notes, but our current focuses include multilevel modeling, parallelism and general speed-ups. Please feel free to use the information in this thread to help you build an MI implementation. One of our long-term goals is for more features to be provided by the user community.

In the past, we've gotten stuck looking for ways to get an estimate of the Hessian without moving the parameter values. To date, we haven't happened upon a solution.