OpenMx for just optimization

dallyride

I've been trying to determine if OpenMx can be used to help with my optimization problem.

I'm very impressed with what I've read and am excited about the prospect of using it for my project. However, I'm not yet convinced that it will do what I need or that it is the right package for this work.

I'm working on confirming a physiological model of heart rate kinetics during cycling. The model takes heart rate, cadence, and power as input data. It calculates a derived power from heart rate, cadence, and a number of physiological values, some of which are measured (those in pink boxes) and some of which are adjusted (those in yellow boxes) so as to minimize the standard deviation of the residuals between derived power and measured power. Here is a link to a screenshot of my spreadsheet:

http://db.tt/3xIh2wuk

The derivation uses things like slope calculations on the input data and various time constants, and there is also recursion. Roughly speaking, the derivation moves from left to right to eventually produce a power value, and the residuals against the measured power are calculated using a moving average. By making successive parameter changes from a set of starting values, the model can be converged to local minima; by changing starting points, I can then look for the global minimum. I never know whether I'm at the global minimum; however, the correlation is often sufficiently high that I can conclude the parameters are very near their optimal values, and hence a close representation of the actual physiological values. (Interestingly, once there has been sufficient data collection to test the model, i.e. to compare the derived physiological parameters against real parameters from a broader variety of individuals and situations, the model itself can be evaluated and adjusted if needed. I can see that part following a more traditional statistical approach.)

So my first and most important question is whether OpenMx can be used to solve for the optimal parameter values. I'm not sure whether algebras can be set up to operate on the data longitudinally (for calculations like slopes, for example) or whether objectives can be set up to perform global minimum searches. If so, my next question is whether OpenMx is well suited to this kind of work (I hope it is, but suspect it isn't). Finally, if it is well suited, are there any posts and/or sources of information I can look at that would help me?

Btw, if anyone is interested in understanding more and would like to discuss further, feel free to email me off list.

Ryne
Yes, it can! Your two options

Yes, it can!

Your two options are the algebra objective and the R objective, with the algebra objective being the far better option. An algebra objective simply minimizes the value of some user-supplied algebra. Here's the simplest possible version of such a model:

test <- mxModel("algDemo",
    mxMatrix("Full", 1, 1, free=TRUE, values=10, labels="a", name="param"),
    mxAlgebra(param^2, name="criterion"),
    mxAlgebraObjective("criterion")
)

We made a free parameter (labeled "a", held in the 1 x 1 matrix "param") and asked the optimizer to find the value of a at which the algebra "criterion" (a^2) is at its minimum (obviously zero). Check ?mxAlgebra to see whether the currently supported algebra functions can express your model. mxRowObjective is the row-wise extension of mxAlgebraObjective: you specify an algebra that is evaluated for each data row, then a second algebra that reduces the vector of row results to the single value the model minimizes.
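Here is a minimal sketch of what a row-wise setup might look like (the names and data are purely illustrative; it assumes raw data with a single column y, and the row algebra refers to the built-in filteredDataRow and rowResults entities):

library(OpenMx)

rowDemo <- mxModel("rowDemo",
    mxData(data.frame(y = rnorm(100, mean = 5)), type = "raw"),
    # one free parameter: the value each data row is compared against
    mxMatrix("Full", 1, 1, free = TRUE, values = 0, labels = "mu", name = "Mu"),
    # evaluated once per row: the squared residual for that row
    mxAlgebra((filteredDataRow - Mu) ^ 2, name = "rowAlg"),
    # collapses the vector of per-row results into the value to minimize
    mxAlgebra(sum(rowResults), name = "reduceAlg"),
    mxRowObjective(rowAlgebra = "rowAlg", reduceAlgebra = "reduceAlg",
                   dimnames = c("y"))
)
rowFit <- mxRun(rowDemo)   # mu should converge to the sample mean of y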

Alternatively, you could use mxRObjective to minimize a function you write in R. However, this is much, much slower than the algebra objective, because OpenMx has to make repeated calls back and forth between R and C, and that takes time.
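For completeness, here is a sketch of the R objective route (again just illustrative, and following my reading of the mxRObjective interface: the user-supplied function receives the candidate model and a persistent state object and returns the value to minimize):

library(OpenMx)

# squares the single free parameter held in matrix "param"
objfun <- function(model, state) {
    a <- mxEval(param, model)   # current value of the 1 x 1 matrix "param"
    return(as.vector(a) ^ 2)    # value for the optimizer to minimize
}

rDemo <- mxModel("rDemo",
    mxMatrix("Full", 1, 1, free = TRUE, values = 10, labels = "a", name = "param"),
    mxRObjective(objfun)
)
rFit <- mxRun(rDemo)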

Let us know if either of these options is feasible for your work.

ryne

dallyride
Great to hear

Ryne,

As someone new to OpenMx, R, and this type of analysis, I very much appreciate the guidance. Now it's just a matter of working my way up the (steep) learning curve to understand how best to set up the script so that it works effectively and efficiently.

I guess what I'll need to work through is how to define a series of mxAlgebras that funnel into an mxAlgebraObjective that essentially minimizes a standard-deviation calculation, or, if mxRowObjective can be used, one that minimizes each individual row residual computed from a set of algebras and parameters. (Perhaps it would be best to use nested models?) Alternatively, I've been working through how a single mxAlgebra could be used; that creates a lot of complexity in the definition but might be the only way.
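To make the first idea concrete, here is a rough, purely illustrative sketch of the kind of model I have in mind (the names are made up; in the real model the "derived" algebra would be replaced by the full chain that computes derived power, and I minimize the variance of the residuals, which has its minimum at the same parameter values as the standard deviation):

library(OpenMx)

N        <- 100
cadence  <- runif(N, 60, 110)                  # stand-in predictor column
measured <- 2.5 * cadence + rnorm(N, sd = 10)  # stand-in measured power

sdDemo <- mxModel("sdDemo",
    mxMatrix("Full", N, 1, FALSE, values = cadence,  name = "cad"),
    mxMatrix("Full", N, 1, FALSE, values = measured, name = "measuredP"),
    # single free parameter, repeated down the column so elementwise algebra conforms
    mxMatrix("Full", N, 1, TRUE,  values = 1, labels = "k", name = "K"),
    mxMatrix("Full", 1, 1, FALSE, values = N, name = "Nobs"),
    # stand-in for the real derived-power algebra chain
    mxAlgebra(cad * K, name = "derived"),
    mxAlgebra(derived - measuredP, name = "resid"),
    # variance of the residuals: E[resid^2] - (E[resid])^2
    mxAlgebra(t(resid) %*% resid / Nobs - (sum(resid) / Nobs) ^ 2, name = "residVar"),
    mxAlgebraObjective("residVar")
)
summary(mxRun(sdDemo))   # k should land near 2.5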

One challenge in the optimization that I think will be difficult to work through is the phase offset and slope calculation. That is, the input data vectors are not in phase (mostly because of poor synchronization between the measurement devices), and the number of rows to use for the slope depends on the accuracy of the measurement device as well as on a calculation from the previous row. Three parameters are used to determine the optimal offset and the number of rows to use in the slope calculation.

Another challenge may be algebras that calculate running totals, which are then used in other algebras. These too are parameterized and will also need to be optimized.

Again, thanks and any pointers to material I can read through or samples along these lines would be very much appreciated.

dallyride
Step one is done and working!

Thought I'd post my script and results. It doesn't look like anything else out there, but with a little trial and error I managed to get it to work. Here is the script:

# read the ride data (tab-delimited export)
ridefile <- read.delim("~/Documents/OpenMX Sample/ridefile.csv")

pecmodel <- mxModel("PEC",
    # fixed constants and observed data columns, one row per observation
    mxMatrix("Full", nrow(ridefile), 1, FALSE, 4, labels="powerconst", name="Powerconst"),
    mxMatrix("Full", nrow(ridefile), 1, FALSE, as.matrix(ridefile["Exertion.Data"]), name="exertiondata"),
    mxMatrix("Full", nrow(ridefile), 1, FALSE, as.matrix(ridefile["Power.Data"]), name="powerdata"),
    mxMatrix("Full", nrow(ridefile), 1, FALSE, as.matrix(ridefile["Cad.Data"]), name="caddata"),
    mxMatrix("Full", nrow(ridefile), 1, FALSE, 186, name="HRmax"),
    # physiological parameters; vmax and pmax are free and shared across all rows
    mxMatrix("Full", nrow(ridefile), 1, TRUE, 210, lbound=150, ubound=350, labels="vmax", name="Vmax"),
    mxMatrix("Full", nrow(ridefile), 1, TRUE, 100, lbound=100, ubound=420, labels="pmax", name="Pmax"),
    mxMatrix("Full", nrow(ridefile), 1, FALSE, 58, lbound=40, ubound=72, labels="hrmin", name="HRmin"),

    # intermediate algebras building up the derived power
    mxAlgebra(Vmax - caddata, name="operand0"),
    mxAlgebra(Powerconst * Pmax * caddata, name="operand1"),
    mxAlgebra(exertiondata - HRmin, name="operand2"),
    mxAlgebra(Vmax * Vmax, name="operand3"),
    mxAlgebra(HRmax - HRmin, name="operand4"),
    mxAlgebra(operand0 * operand1 * operand2 / operand3 / operand4, name="derivedpowerdata"),
    mxAlgebra(abs(derivedpowerdata - powerdata), name="residuals"),

    # sum of absolute residuals: the quantity the optimizer minimizes
    mxAlgebra(sum(residuals), name="powerresidual"),
    mxAlgebraObjective("powerresidual")
)

optimized <- mxRun(pecmodel)
summary(optimized)

Running PEC
Warning message:
In model 'PEC' NPSOL returned a non-zero status code 6. The model does not satisfy the first-order optimality conditions to the required accuracy, and no improved point for the merit function could be found during the final linesearch (Mx status RED)
> summary(optimized)
The model does not satisfy the first-order optimality conditions to the required accuracy, and no improved point for the merit function could be found during the final linesearch (Mx status RED)

free parameters:
name matrix row col Estimate Std.Error lbound ubound
1 vmax Vmax 1 1 249.4819 0.2272638 150 350
2 pmax Pmax 1 1 336.2352 0.1872338 100 420

observed statistics: 0
estimated parameters: 2
degrees of freedom: -2
-2 log likelihood: 39340.33
saturated -2 log likelihood: NA
number of observations: 0
chi-square: NA
p: NA
Information Criteria:
df Penalty Parameters Penalty Sample-Size Adjusted
AIC: NA NA NA
BIC: NA NA NA
CFI: NA
TLI: NA
RMSEA: NA
timestamp: 2013-02-08 18:48:10
frontend time: 18.42052 secs
backend time: 0.03847694 secs
independent submodels time: 2.598763e-05 secs
wall clock time: 18.45902 secs
cpu time: 18.45902 secs
openmx version number: 1.3.2-2301

I'm only using an hour's worth of data. I would like to be able to analyze an entire file with two or more hours' worth, but the frontend time increases substantially. As well, this is less than half of the analysis, so once I start piling on the remaining algebras, the time to compute might make this approach infeasible.

Are there any other approaches I could use that would give better analysis times? Is my script efficiently designed?

Thanks for your feedback.

dallyride
End of the rope

So at this stage, I think I'm at the end of the rope.

In order to complete the optimization process, I'm going to need to calculate a running total or accumulation value down several matrices. As far as I can tell, there is no way to do that with mxAlgebra; functions like "cumsum" are not available. (I guess I could pull out some of my old coding skills and update the source, but that would be pushing it for me.)

The software is very easy to use; I was able to get partway through the optimization without much fuss. But I don't think it is meant to address the type of analysis I'm looking to do.

neale
Matrix algebra operations can sum

If I wanted to sum the elements in a column of a matrix, I would premultiply it by a row vector of 1's. If I wanted only some of the elements summed, I would put a zero in the positions I didn't want. I've not read your needs carefully, so I don't know whether this helps, but I hope it does.
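Extending that idea (this is only an illustration, not taken from your model): a running total of a column vector can be written as premultiplication by a lower-triangular matrix of ones, so a cumulative sum is expressible in mxAlgebra even without a cumsum() function:

library(OpenMx)

n <- 5
cumDemo <- mxModel("cumDemo",
    # the column to accumulate (fixed here; it could hold data or be an algebra)
    mxMatrix("Full", n, 1, FALSE, values = 1:5, name = "x"),
    # lower-triangular matrix of ones: row i sums elements 1 through i of x
    mxMatrix("Full", n, n, FALSE,
             values = lower.tri(diag(n), diag = TRUE) * 1, name = "L"),
    mxAlgebra(L %*% x, name = "runningTotal")
)
mxEval(runningTotal, cumDemo, compute = TRUE)   # 1, 3, 6, 10, 15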