
Log likelihood concave?

cgoldammer (Joined: 04/07/2010 - 19:07)

Hi,

In a structural equation model in RAM notation, one can easily write out the log-likelihood, as Bollen does (p. 107). My question: is the log-likelihood concave? In other words, when mxRun tells me that my model converged, do I need to worry that it has only found a local optimum? I'd be happy about any background reading on this.

Thanks and regards, Chris

neale (Joined: 07/31/2009 - 15:14)

Chris

I'm not sure. In general I would say that it is not, because there are often invariances with respect to the sign of parameters: flip the signs of all the factor loadings and the model fits equally well. Since a model in which a factor loading is estimated at zero may fit worse than one in which it is either -.7 or +.7, the surface cannot be concave. In multivariate space things are more complicated, and the multidimensional surface may not be concave with respect to all parameters simultaneously. Worse, the likelihood may not be evaluable at all for some values of the parameters, e.g., when the predicted covariance matrix is not positive definite. The function is therefore discontinuous in such regions, which I take to be incompatible with concavity (and convexity).
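The sign-flip invariance is easy to verify numerically. Here is a minimal sketch in Python rather than OpenMx/R, using made-up loading and residual-variance values for a one-factor model: flipping the sign of every loading leaves the implied covariance matrix, and hence the likelihood, unchanged, so there are at least two distinct global optima and the surface cannot be strictly concave.

```python
import numpy as np

# Hypothetical one-factor model with three indicators:
# implied covariance is Sigma = lambda lambda' + Psi, Psi diagonal.
lam = np.array([0.7, 0.8, 0.6])        # factor loadings (assumed values)
psi = np.diag([0.51, 0.36, 0.64])      # residual variances (assumed values)

def implied_cov(loadings, resid):
    return np.outer(loadings, loadings) + resid

sigma_pos = implied_cov(lam, psi)
sigma_neg = implied_cov(-lam, psi)     # flip the sign of every loading

# Identical implied covariance => identical likelihood at both
# parameter vectors: two distinct optima with the same fit.
print(np.allclose(sigma_pos, sigma_neg))  # True
```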

Bottom line: always be careful with optimization; make sure that the results agree with inspection of the properties of the data. Does the model make reasonable predictions about the means and covariances? Do nested models fit no better than less restricted models? Remember too that SEMs may be underidentified, either globally or with certain datasets (empirical underidentification), so assuming that the solution found is the global solution is probably not wise. When in doubt, simulate data based on the model using known parameter values, and see whether optimization recovers them regularly and precisely from a variety of initial starting values.
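That simulation check can be sketched as follows, again in Python rather than OpenMx, with hypothetical parameter values: simulate data from a known one-factor model, then minimize -2 log-likelihood from several starting values (including sign-flipped ones) and see whether the loadings are recovered, up to a common sign flip.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate data from a one-factor model with known (hypothetical) values.
true_lam = np.array([0.7, 0.8, 0.6])     # loadings
true_psi = np.array([0.51, 0.36, 0.64])  # residual variances
n = 5000
eta = rng.standard_normal(n)
y = eta[:, None] * true_lam + rng.standard_normal((n, 3)) * np.sqrt(true_psi)
S = np.cov(y, rowvar=False)

def neg2ll(theta):
    """-2 log-likelihood (up to a constant) at the implied covariance."""
    lam, psi = theta[:3], theta[3:]
    sigma = np.outer(lam, lam) + np.diag(psi)
    sign, logdet = np.linalg.slogdet(sigma)
    if sign <= 0:  # predicted covariance not positive definite
        return 1e10
    return n * (logdet + np.trace(np.linalg.solve(sigma, S)))

# Optimize from several starting points, including sign-flipped ones;
# residual variances are bounded away from zero.
bounds = [(None, None)] * 3 + [(1e-6, None)] * 3
starts = [[.5, .5, .5, 1, 1, 1],
          [-.5, -.5, -.5, 1, 1, 1],
          [1, .1, 1, .5, .5, .5]]
fits = [minimize(neg2ll, np.array(s, float),
                 method="L-BFGS-B", bounds=bounds).x
        for s in starts]

# The loadings should be recovered up to a common sign flip.
for x in fits:
    print(np.round(np.abs(x[:3]), 2))
```

The sign-flipped start converges to the mirror-image solution with identical fit, which is exactly why agreement across starting values should be judged up to such invariances rather than on raw parameter values.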

cgoldammer (Joined: 04/07/2010 - 19:07)

Thanks, those are excellent points!