In the fifth week of the course, we turn our attention to latent variables.
The lecture begins with a review of transformation matrices and principal components from a theoretical standpoint.
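To make the linear-algebra review concrete, here is a small numpy sketch of the idea that principal components are an orthogonal transformation matrix obtained from the eigendecomposition of a covariance matrix. The simulated data, mixing matrix, and seed are made up for illustration and are not part of the course materials.

```python
import numpy as np

# Simulate correlated data by applying a (made-up) linear transformation
# to independent normal variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) @ np.array([[2.0, 0.5, 0.0],
                                          [0.0, 1.0, 0.3],
                                          [0.0, 0.0, 0.5]])

S = np.cov(X, rowvar=False)           # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(S)  # columns of eigvecs = principal axes
order = np.argsort(eigvals)[::-1]     # sort components by variance explained
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# The eigenvector matrix is an orthogonal transformation: it rotates the
# centered data into uncorrelated scores whose variances are the eigenvalues.
scores = (X - X.mean(axis=0)) @ eigvecs
print(np.allclose(eigvecs @ eigvecs.T, np.eye(3)))  # True
```

Rotating by the eigenvectors diagonalizes the covariance of the scores, which is the sense in which the components "re-express" the same data in a new coordinate system.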
We then see how models that include latent predictors can be estimated, and we come to grips with the problems of identification that such models raise.
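One standard way to think about identification is the counting (t-)rule: a model cannot be identified if its free parameters outnumber the distinct elements of the observed covariance matrix, p(p+1)/2. The sketch below illustrates that arithmetic for a one-factor, three-indicator model; it is an illustrative worked example, not course code, and the counting rule is only a necessary condition, not a sufficient one.

```python
def observed_moments(p):
    """Distinct variances and covariances among p observed variables."""
    return p * (p + 1) // 2

# One factor measured by three indicators: 3 loadings + 3 residual
# variances, with the factor variance fixed to 1 to set the scale,
# gives 6 free parameters against 6 observed moments.
free_params = 3 + 3
print(observed_moments(3), free_params)  # 6 6 -> just-identified
```

With only two indicators the model would have 4 free parameters but just 3 observed moments, so it could not be identified without further constraints.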
Next we fit a series of OpenMx models to a simulated data set and see how simple structure can emerge.
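The course does this with OpenMx in R; as a language-neutral sketch of what "simple structure" means in simulated data, the numpy snippet below generates indicators from two uncorrelated factors, each loading on its own block of variables, and shows that within-block correlations dominate cross-block ones. The loadings, sample size, and seed are invented for illustration.

```python
import numpy as np

# Two uncorrelated latent factors, each driving its own block of three
# indicators: this loading pattern is "simple structure".
rng = np.random.default_rng(1)
n = 2000
factors = rng.normal(size=(n, 2))
loadings = np.array([[0.8, 0.0],   # x1..x3 load on factor 1 only
                     [0.7, 0.0],
                     [0.6, 0.0],
                     [0.0, 0.8],   # x4..x6 load on factor 2 only
                     [0.0, 0.7],
                     [0.0, 0.6]])
X = factors @ loadings.T + rng.normal(scale=0.5, size=(n, 6))

R = np.corrcoef(X, rowvar=False)
within = R[0, 1]    # correlation between two indicators of factor 1
between = R[0, 3]   # correlation across the two factor blocks
print(within > between)  # within-block correlations dominate
```

A factor model fit to data like these (in OpenMx or elsewhere) recovers the block pattern in the loadings, which is how simple structure "emerges" from the correlation matrix.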
Finally, we discuss how we can understand what latent variables might mean. Creating meaningful models is the most important task we face; the math and software are simply tools for managing the complexity of the meaning we are after.