I have three dependent variables and 10 predictors, and I am analyzing the data with multivariate regression. However, I need to compare the model, and the contribution of each predictor, with those of another group. Any ideas on how to proceed?
Just combine the groups into one data set and identify the origin of every observation with a group factor variable. Then, for every predictor, add an interaction effect with group. The interaction estimates tell you, per predictor, how the groups differ. I have written an extensive chapter on interaction effects in my book (see chapter 5.5).
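A minimal sketch of that pooled-data approach in R, for one of the three outcomes; the data frames group_a and group_b, the outcome y1, the predictors x1-x3, and the factor grp are all hypothetical placeholders:

```r
# Pool the two groups and mark each observation's origin
dat <- rbind(
  transform(group_a, grp = "A"),
  transform(group_b, grp = "B")
)
dat$grp <- factor(dat$grp)

# Interact the group factor with every predictor of interest
fit <- lm(y1 ~ grp * (x1 + x2 + x3), data = dat)
summary(fit)

# The grpB:x1, grpB:x2, ... rows estimate how much each predictor's
# slope in group B differs from its slope in group A.
```

With 10 predictors you would simply list all of them inside the parentheses; each one then gets its own group-difference estimate.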
Most SEM programs are more than capable of testing the equality of regression models across groups. Moreover, if model-data fit is poor under a test of strict equality, you can easily evaluate whether the misfit is due to something simple (e.g., intercepts differing across groups) or something more complex (e.g., some regression coefficients differing across groups for specific predictors).
Martin Schmettow is correct in recommending that you assess interaction effects; in this context the grouping variable is sometimes called a "moderator" or "moderating variable." If you stay with regression, you will need to run three models, one for each of your dependent variables, and add the interaction effect to each of those models.
Alternatively, if you use the more involved approach of Structural Equation Modeling (SEM), you will be able to include all three dependent variables in the same model.
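For the SEM route, here is a hedged sketch using the lavaan package in R, reusing the hypothetical dat, grp, and variable names from above: fit the multi-group model once with the regression coefficients free and once with them constrained equal across groups, then compare the two fits.

```r
library(lavaan)

# All three outcomes regressed on the same predictors
model <- '
  y1 ~ x1 + x2 + x3
  y2 ~ x1 + x2 + x3
  y3 ~ x1 + x2 + x3
'

# Coefficients free to differ between groups
fit_free  <- sem(model, data = dat, group = "grp")

# Intercepts and regression coefficients constrained equal across groups
fit_equal <- sem(model, data = dat, group = "grp",
                 group.equal = c("intercepts", "regressions"))

# Likelihood-ratio test of the equality constraints
lavTestLRT(fit_free, fit_equal)
```

If the constrained model fits clearly worse, lavaan's group.partial argument (e.g., group.partial = "y1~x2") lets you free individual parameters one at a time to locate where the groups actually differ, along the lines described above.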
Hi Ana. As the two Davids* have suggested, you could use SEM software. But I don't think SEM software is required. I think you can accomplish the same thing with a procedure that estimates multilevel models. See this old SPSSX-L post for an example and some discussion.
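The linked post is SPSS-oriented; as a rough R analogue (my own sketch, not necessarily what that post does), you can stack the three outcomes into long format and fit one mixed model, with a random intercept per case to absorb the correlation among that case's outcomes:

```r
library(lme4)
library(tidyr)

# One row per case per outcome; y1-y3, x1-x3, and grp are the same
# hypothetical names used earlier
dat$id <- seq_len(nrow(dat))
long <- pivot_longer(dat, cols = c(y1, y2, y3),
                     names_to = "outcome", values_to = "value")

# Separate intercepts, slopes, and group interactions for each outcome;
# (1 | id) ties together the three outcomes from the same case
fit <- lmer(value ~ 0 + outcome + outcome:(grp * (x1 + x2 + x3)) + (1 | id),
            data = long)
summary(fit)
```

Note that this sketch assumes a common residual variance across the three outcomes; nlme::lme with a varIdent() weights structure relaxes that if needed.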
P.S. When I read that you have 10 predictors, I wondered whether your sample size was large enough to support that many variables. For a good overview of overfitting, see Mike Babyak's nice article.
The interaction approach already suggested by everyone here is the way to go for comparing models. As for assessing the importance of individual predictors, I would suggest trying the lasso (or its generalization, the elastic net). Implementations are available in R; I have added some references. Best wishes, David Booth. P.S. The lasso also helps prevent overfitting.
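A hedged sketch of that route with the glmnet package in R, again using the hypothetical names from above: alpha = 1 gives the lasso, values between 0 and 1 give the elastic net, and cross-validation picks the penalty. Run it once per dependent variable.

```r
library(glmnet)

# Model matrix including the group-by-predictor interactions; drop the intercept column
X <- model.matrix(~ grp * (x1 + x2 + x3), data = dat)[, -1]
y <- dat$y1   # repeat for y2 and y3

# Cross-validated lasso (alpha = 1); set 0 < alpha < 1 for the elastic net
cv_fit <- cv.glmnet(X, y, alpha = 1)

# Coefficients at the cross-validated penalty; terms shrunk exactly to
# zero are effectively dropped, which is how the lasso limits overfitting
coef(cv_fit, s = "lambda.min")
```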