You can deal with multicollinearity in SEM by modeling a relationship between the collinear variables (e.g. correlation or causation) or by using a latent variable to eliminate the spurious relationship.
Another way of dealing with collinearity is to combine the variables. For example, if you have 5 variables and 2 of them are correlated in such a way that they measure the same thing, combine those 2 into one, leaving you with 4 variables. This works well for latent variables.
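One common way to combine two near-redundant indicators is to standardize each and average them. A minimal pure-Python sketch (the variable names and example data are illustrative, and averaging standardized scores is just one of several composite-building choices):

```python
import statistics

def standardize(xs):
    """Center to mean 0 and scale to unit (sample) standard deviation."""
    mu = statistics.mean(xs)
    sd = statistics.stdev(xs)
    return [(x - mu) / sd for x in xs]

def combine(a, b):
    """Replace two indicators that measure the same thing
    with the mean of their standardized scores."""
    return [(x + y) / 2 for x, y in zip(standardize(a), standardize(b))]

# Two indicators capturing essentially the same construct,
# recorded on different scales:
scale_a = [2.0, 3.1, 4.2, 5.0, 6.1]
scale_b = [21, 30, 44, 52, 60]
composite = combine(scale_a, scale_b)
print(len(composite))  # → 5 (one composite score per case)
```

The composite then replaces both original indicators in the model, so the pair no longer contributes collinearity.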
I would like to note that collinearity increases the standard errors of the parameter estimates but leaves the consistency and unbiasedness of these estimates untouched.
Hence, whatever you do, consider whether the method reduces collinearity but at the same time biases the estimates and distorts the main theoretical model.
Approaches like PCA regression, PLS regression, or combining variables lead to a changed causal model that is no longer the theoretical model the researcher originally intended. In my research I first strive for a causal model that reflects my thinking about the field, and I try to defend that model as long as I can against any data-manipulation strategies or statistical approaches that would change it.
Collinearity is a consequence of the (very common) violation of one of the most fundamental assumptions underlying regression--that the X (predictor) values are chosen, not merely sampled or observed like the Y (dependent) values. If you could overcome the design flaw that left you with collinearity--sampled X values--in the first place, that would address the problem much better than an ex post adjustment. The after-the-fact fix is trying to answer an impossible question: what would your data look like if the research design had been different from what it actually was?