I have 4 independent variables within a multiple regression model. I am trying to figure out the percentage of variance each individual independent variable contributes to the model using SPSS. How do I do this?
If all the IVs/predictor variables are completely uncorrelated with one another, each contributes exactly r^2 to the multiple R-squared (where r^2 is the squared zero-order correlation of that IV with the DV).
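To see this concretely outside SPSS, here is a small simulation sketch (Python with numpy and statsmodels; the data are invented purely for illustration) in which the four IVs are exactly uncorrelated, so the squared zero-order correlations add up to the multiple R-squared:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Build 4 mutually uncorrelated (mean-zero, orthogonal) predictors
# by orthogonalizing a centered random matrix.
raw = rng.normal(size=(200, 4))
Q, _ = np.linalg.qr(raw - raw.mean(axis=0))
X = Q  # columns are mean-zero and pairwise uncorrelated

# Simulate a DV from these predictors plus noise.
y = X @ np.array([1.0, 2.0, 0.5, -1.5]) + rng.normal(scale=0.1, size=200)

# Squared zero-order correlation of each IV with the DV.
r2_each = np.array([np.corrcoef(X[:, j], y)[0, 1] ** 2 for j in range(4)])

# Multiple R-squared from regressing y on all 4 IVs.
fit = sm.OLS(y, sm.add_constant(X)).fit()

print(r2_each)                        # each IV's individual contribution
print(r2_each.sum(), fit.rsquared)    # these two agree (up to rounding)
```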
If there is any degree of collinearity among the IVs, then you have to be specific as to what you want. If you want each IV's individual explanatory power for the DV, that's r^2 (as above).
If you want to know how much a specific IV adds to the explanatory power of the model, given the other IVs included, then you can: (a) find the R-squared for the full model; (b) find the R-squared for the model without the target IV; and subtract the second value from the first. Repeat for each IV. I believe this process has been referred to as commonality analysis.
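As a rough sketch of steps (a) and (b) outside SPSS (Python with pandas/statsmodels; the simulated variables y and x1-x4 simply stand in for your own data):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated stand-in for your data: 4 somewhat collinear IVs and a DV.
rng = np.random.default_rng(1)
n = 300
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)    # deliberately correlated with x1
x3 = rng.normal(size=n)
x4 = 0.3 * x3 + rng.normal(size=n)
y = 1.0 * x1 + 0.5 * x2 + 0.8 * x3 + rng.normal(size=n)
df = pd.DataFrame(dict(y=y, x1=x1, x2=x2, x3=x3, x4=x4))

ivs = ["x1", "x2", "x3", "x4"]

def r_squared(data, predictors, dv="y"):
    """R-squared from an OLS regression of dv on the given predictors."""
    X = sm.add_constant(data[predictors])
    return sm.OLS(data[dv], X).fit().rsquared

# (a) R-squared of the full model
r2_full = r_squared(df, ivs)

# (b) R-squared without each target IV, subtracted from (a)
for iv in ivs:
    others = [v for v in ivs if v != iv]
    print(iv, "adds", round(r2_full - r_squared(df, others), 4))
```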
If you want to know the relative emphasis given to each IV in a regression model, you can compare the absolute values of the standardized regression coefficients (the values SPSS labels "Beta" in the Coefficients table).
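Continuing the same sketch (the df and ivs from the block above), z-scoring every variable and refitting reproduces those standardized coefficients, whose absolute values you can then compare:

```python
import statsmodels.api as sm

# Standardize all variables; the slopes are then the standardized
# regression coefficients (what SPSS reports as "Beta").
z = (df - df.mean()) / df.std()
betas = sm.OLS(z["y"], sm.add_constant(z[ivs])).fit().params.drop("const")
print(betas.abs().sort_values(ascending=False))
```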
It is not clear to me what you are specifically interested in; maybe David's approach is suitable for you. Are you interested in the unique variance explained by each predictor when all predictors are in the model? If yes, you can request the semipartial correlations in SPSS in the regression dialogue: Statistics --> Part and partial correlations. The squared semipartial correlation may answer your question.
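If you want to verify the squared part (semipartial) correlations outside SPSS, here is a rough Python sketch on made-up data: the part correlation of an IV is the correlation between the DV and the residual of that IV after regressing it on the other IVs, and its square is that IV's unique increment to R-squared.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Made-up data standing in for your own DV (y) and four predictors.
rng = np.random.default_rng(3)
n = 300
X = pd.DataFrame(rng.normal(size=(n, 4)), columns=["x1", "x2", "x3", "x4"])
X["x2"] = 0.6 * X["x1"] + rng.normal(size=n)   # introduce some collinearity
y = X["x1"] + 0.5 * X["x2"] + 0.8 * X["x3"] + rng.normal(size=n)

# Part (semipartial) correlation of each IV: correlate the DV with the
# residual of that IV after regressing it on the remaining IVs.
for iv in X.columns:
    others = X.drop(columns=iv)
    resid = sm.OLS(X[iv], sm.add_constant(others)).fit().resid
    part_r = np.corrcoef(y, resid)[0, 1]
    print(iv, "squared part correlation =", round(part_r ** 2, 4))
```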
Are you performing a one-way ANOVA? It carries the assumption of equal variances between groups; in regression this assumption is called homoscedasticity, and its violation heteroscedasticity. In Welch's ANOVA, you don't need to satisfy that assumption.
R-squared is the proportion of the variance in the DV explained by the independent variables, so it can be computed as SSRegression / SSTotal.
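For example, on made-up data (a Python/statsmodels sketch), the ratio of those two sums of squares reproduces the reported R-squared:

```python
import numpy as np
import statsmodels.api as sm

# Toy data purely for illustration.
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 4))
y = X @ [0.4, 0.3, 0.2, 0.1] + rng.normal(size=100)

fit = sm.OLS(y, sm.add_constant(X)).fit()
ss_regression, ss_residual = fit.ess, fit.ssr
ss_total = ss_regression + ss_residual

print(ss_regression / ss_total)   # SSRegression / SSTotal
print(fit.rsquared)               # same value reported as R-squared
```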