Attachment | Size |
---|---|
Multi.IQ_.Ht_.NTF_.Model-mck4.R [6] | 17.16 KB |
Hi all,
This probably won't be easy to answer given how vague the question is, but...
We're trying to run a multivariate 'nuclear twin family' model with many parameters to be estimated (~30 free parameters in the simplified version we're trying to run now, in which many parameters we would otherwise estimate are fixed) and many non-linear constraints (~10 in the current model). The problem is that the script has been running for 1.5 hours so far and is still going. I'll post the script as an attachment in case anyone wants to glance over it for anything obvious, but barring that, here's my question:
Does anyone have ideas on where we can start in trying to make this run faster? What kinds of issues might make it run this slowly? I've thought of entering covariance matrices rather than raw data. I've also thought of reducing it to a univariate model first, to see whether that runs at all. Another thought was trying one of the R parallelization packages that optimize low-level functions, such as "pnmath0"; I don't think "snow" will speed anything up here, since it's a single model we're trying to run. Might any of these ideas affect the speed? What other tricks and/or coding alterations (see attached) might people suggest?
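For reference, here's roughly what the covariance-matrix idea would look like in OpenMx — a minimal sketch, where `ntfData` is a placeholder name for the raw data frame, not something from the attached script. Note the caveat: with any missing data, fitting to a pairwise-complete covariance matrix is only an approximation of what FIML on raw data gives you.

```r
library(OpenMx)

# Hypothetical: swap the raw-data mxData object for summary statistics,
# so the ML fit function works with one covariance matrix instead of
# evaluating the likelihood row by row.
covData <- mxData(
  observed = cov(ntfData, use = "pairwise.complete.obs"),
  type     = "cov",
  numObs   = nrow(ntfData),
  means    = colMeans(ntfData, na.rm = TRUE)  # only needed if the model has a means part
)
```

With `type = "cov"` the model must either drop its means component or get the observed means via the `means` argument, as above; otherwise the rest of the model specification can stay the same.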
Thanks,
Matt