At the developers meeting on 8/12 we discussed the following:
- The group expressed a desire to either (1) get several testers to use the joint ordinal and continuous integration in full information maximum likelihood in OpenMx 1.1 or (2) release OpenMx 1.1 to the user base and handle any issues with joint ordinal and continuous integration as they present themselves. Mike Neale described a modification to his polychoric script that would exercise a subset of the joint ordinal and continuous integration functionality. While this is far from a complete test, it will offer more assurance about the implementation than we currently have.
- Dan Hackett continues to explore an implementation of FIML for PPML. Based on group discussions and Dan's exploration, moving FIML to the back end of OpenMx appears to be a promising possibility.
- Analytical solutions for the gradient and Hessian to improve the efficiency of FIML were discussed. Currently, OpenMx uses the Newton-Raphson optimization method, which requires both the gradient and the Hessian. Mike Hunter is testing the convergence of expressions for the gradient he has derived. Analytical expressions for the Hessian still need to be explored.
- The group also discussed the possibility of implementing the Fletcher-Powell or BFGS optimization method, which only requires the gradient to be supplied. The group believes that the BFGS method is less general than Newton-Raphson (it is not applicable to non-linear constraints) but can provide some speedup for problems where it is applicable. The linked Wikipedia page for BFGS is believed to contain an error; the error-free documentation describing the method is here.
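As a point of reference for the gradient/Hessian discussion above, a minimal Newton-Raphson sketch (illustrative Python on a toy objective, not OpenMx code) shows why each iteration requires both the gradient and the Hessian:

```python
import numpy as np

def newton_raphson(grad, hess, x0, tol=1e-8, max_iter=50):
    """Minimize via the Newton-Raphson update x <- x - H(x)^{-1} g(x).

    Needs both the gradient `grad` and the Hessian `hess` at every step.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Solve H(x) * step = g(x) rather than forming the inverse explicitly.
        x = x - np.linalg.solve(hess(x), g)
    return x

# Toy objective (illustrative only): f(x) = (x0 - 1)^2 + 2*(x1 + 3)^2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 3)])
hess = lambda x: np.array([[2.0, 0.0], [0.0, 4.0]])
print(newton_raphson(grad, hess, [0.0, 0.0]))  # minimizer is (1, -3)
```

For a quadratic objective like this one the method converges in a single step; the point of the analytical-derivatives work discussed above is that evaluating the Hessian (here a hand-coded constant matrix) is the expensive part for FIML objectives.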
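For contrast with Newton-Raphson, a minimal BFGS sketch (again illustrative Python, not OpenMx's or any library's implementation) shows how the inverse Hessian is approximated purely from successive gradient evaluations, so no analytical Hessian is needed:

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=500):
    """Minimize f with BFGS: only f and its gradient are supplied."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction
        t = 1.0                        # backtracking (Armijo) line search
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                 # curvature condition; skip update if violated
            rho = 1.0 / sy
            I = np.eye(n)
            # Standard BFGS update of the inverse-Hessian approximation.
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Rosenbrock function, minimum at (1, 1); gradient supplied, Hessian never used.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(bfgs(f, grad, [-1.0, 1.0]))
```

This is the trade-off noted above: BFGS avoids deriving and evaluating the Hessian, but unlike the constrained Newton-type optimization OpenMx relies on, this plain form handles only unconstrained problems.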