Basically, I found that simple linear regression needs to pass six assumptions before you analyze the data, while multiple regression needs to pass eight.
All are the same except multicollinearity. Since you have only a single predictor in a simple model, there is no need to worry about it; that is not the case with multiple predictors.
The assumption of 'lack of multicollinearity' among the predictors is not relevant in simple linear regression. Regarding your concern about 6 (in simple regression) vs. 8 (in multiple regression) assumptions, you would have to specify which two assumptions are supposedly missing in simple regression.
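To make the multicollinearity point concrete, here is a minimal sketch (assuming the numpy and statsmodels packages; the variable names and simulated data are purely illustrative, not from the question) of checking it with variance inflation factors, which only makes sense once you have two or more predictors:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=100)      # nearly collinear with x1
X = sm.add_constant(np.column_stack([x1, x2]))  # intercept + two predictors

# VIF for each predictor (index 0 is the constant, so skip it);
# values much above ~10 are a common rule-of-thumb red flag
for i in range(1, X.shape[1]):
    print(f"VIF for predictor {i}: {variance_inflation_factor(X, i):.1f}")
```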
The theory of regression analysis has been developed under certain assumptions: the errors should follow a normal distribution with mean zero and variance sigma squared, and they should be free from autocorrelation (for time-series data) and heteroscedasticity (for cross-sectional data), among others. However, when you work with real data there is a chance that these basic assumptions are violated; there might be problems of multicollinearity, autocorrelation, heteroscedasticity, and so on. Diagnostic criteria are available to detect these violations, if any, and one then has to apply rectification measure(s) to get rid of them. Let me know if you need further info.
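As a hedged sketch of those diagnostics (assuming statsmodels and numpy; the data below is simulated just for illustration), the Durbin-Watson statistic screens for first-order autocorrelation and the Breusch-Pagan test for heteroscedasticity:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 2.0 + 1.5 * x + rng.normal(size=200)

X = sm.add_constant(x)
res = sm.OLS(y, X).fit()

# Durbin-Watson statistic near 2 suggests no first-order autocorrelation
print("Durbin-Watson:", durbin_watson(res.resid))

# Breusch-Pagan: a small p-value suggests heteroscedastic errors
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(res.resid, X)
print("Breusch-Pagan p-value:", lm_pvalue)
```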
The assumptions for simple linear regression and multiple regression are the same. The four basic assumptions are L.I.N.E.: Linearity, Independence (of the error term), Normality, and Equal variance (the conditional distribution of the error term is normal with mean zero and constant variance), as you can see in most basic statistics books.
You can check this link: http://en.wikipedia.org/wiki/Ordinary_least_squares#Classical_linear_regression_model
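A quick way to eyeball the N and E parts of L.I.N.E. is to inspect the residuals of a fitted model. A minimal sketch follows (assuming numpy, scipy, and statsmodels; the simulated data and the median-split variance check are illustrative choices, not a standard prescription):

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(2)
x = rng.normal(size=150)
y = 1.0 + 0.5 * x + rng.normal(size=150)
res = sm.OLS(y, sm.add_constant(x)).fit()

# Normality: Shapiro-Wilk on residuals (large p-value = no evidence against N)
stat, p = stats.shapiro(res.resid)
print("Shapiro-Wilk p-value:", p)

# Equal variance: compare residual spread across low vs. high fitted values
median_fit = np.median(res.fittedvalues)
lo = res.resid[res.fittedvalues < median_fit]
hi = res.resid[res.fittedvalues >= median_fit]
print("Residual SD (low vs. high fitted):", lo.std(), hi.std())
```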
The additional assumption of no perfect multicollinearity, i.e. that Q_xx = E[x_i x_i'] is a positive-definite matrix, is important for the "existence" of the estimator.
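A small numpy-only illustration of why that matters (the data is simulated; this is a sketch, not a proof): under perfect multicollinearity the sample analogue X'X is singular, so the OLS formula (X'X)^(-1) X'y has no solution.

```python
import numpy as np

rng = np.random.default_rng(3)
x1 = rng.normal(size=50)
x2 = 2.0 * x1                        # x2 is an exact linear copy of x1
X = np.column_stack([np.ones(50), x1, x2])

XtX = X.T @ X
# X has 3 columns but only 2 linearly independent ones, so X'X is singular:
# the rank falls short of the column count and the condition number blows up
print("rank of X'X:", np.linalg.matrix_rank(XtX))   # -> 2, not 3
print("condition number:", np.linalg.cond(XtX))     # astronomically large
```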