My regression analysis produced a perfect model after regressing a dependent variable on ten independent variables. Could it be that there are too many variables, or is there a specific technique I am missing to deal with this?
Did you check that the multiple regression assumptions hold for your model? Check the VIF for multicollinearity, and test the error term for normality and heteroscedasticity. Also make sure the model is theoretically sound and the independent variables are supported by the relevant literature.
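To make the multicollinearity check concrete, here is a minimal sketch of a VIF computation (Python with NumPy assumed; in practice a package such as statsmodels provides this). VIF_j = 1 / (1 - R2_j), where R2_j comes from regressing predictor j on the remaining predictors; values well above 10 are a common warning sign. The data below is simulated purely for illustration.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X (n x p design
    matrix, no intercept column). VIF_j = 1 / (1 - R2_j), where R2_j
    is from regressing column j on the remaining columns."""
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        A = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1 - (resid @ resid) / (((y - y.mean()) ** 2).sum())
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.05, size=100)   # nearly collinear with x1
x3 = rng.normal(size=100)                    # independent predictor
X = np.column_stack([x1, x2, x3])
print(vif(X))   # x1 and x2 show very large VIFs; x3 stays near 1
```

Any predictor flagged this way is a candidate for removal or combination before trusting the fitted coefficients.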
Compare the R2 and adjusted R2 of your regression model. If the adjusted R2 is much smaller than the R2, your model is overfitted.
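As a quick illustration of that gap (a sketch in Python with NumPy; the data is simulated noise, chosen only to make the point), fitting many irrelevant predictors on a small sample inflates R2 while adjusted R2 stays low:

```python
import numpy as np

def r2_and_adj(y, X):
    """Fit OLS with an intercept; return (R2, adjusted R2), where
    adj R2 = 1 - (1 - R2) * (n - 1) / (n - p - 1) for p predictors."""
    n, p = X.shape
    A = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1 - (resid @ resid) / (((y - y.mean()) ** 2).sum())
    adj = 1 - (1 - r2) * (n - 1) / (n - p - 1)
    return r2, adj

rng = np.random.default_rng(1)
n = 15
y = rng.normal(size=n)              # outcome unrelated to anything
X = rng.normal(size=(n, 10))        # 10 pure-noise predictors
r2, adj = r2_and_adj(y, X)
print(r2, adj)   # R2 looks impressive; adjusted R2 is much lower
```

With only 15 observations and 10 predictors, the raw R2 can look excellent even though every predictor is pure noise, which is exactly the overfitting symptom described above.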
In addition, look into the model-assumption validity diagnostics mentioned by Prof. Ntanos, using appropriate graphical/analytical techniques.
Are you talking about OLS linear regression? What is the sample size? Are the 10 explanatory variables all quantitative, or do you have a mix of quantitative and categorical? And by what criterion is the model fitting perfectly? I think everyone is assuming you mean that R2 = 1, but you have not actually said that. Thanks for clarifying.
Agreeing with Bruce Weaver: you can always get a perfect fit when the number of predictor variables approaches the number of observations, but you have explained nothing. Think of two observations and a straight-line fit with an intercept and a slope: a perfect fit!
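That two-point example can be verified in a few lines (a sketch in Python with NumPy; the y-values are arbitrary):

```python
import numpy as np

# Two observations, one predictor: the intercept and slope pass
# exactly through both points, so R2 = 1 whatever the y-values are.
x = np.array([1.0, 2.0])
y = np.array([3.7, -5.2])            # arbitrary outcomes
A = np.column_stack([np.ones(2), x])
beta = np.linalg.solve(A, y)         # square system: exact solution
resid = y - A @ beta
r2 = 1 - (resid @ resid) / (((y - y.mean()) ** 2).sum())
print(r2)   # R2 = 1 even though the "model" explains nothing
```

The same mechanism operates whenever the number of coefficients gets close to the number of observations, just less visibly.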
Thanks to you all for your rapid responses. I have gained from your wealth of knowledge.
I realized that regression works better with a large number of data points. What I did was split the data points to make them more numerous; this not only showed the level of contribution of each variable, but the data was also normally distributed.
NO PERFECT MODEL. There is no such thing as a "perfect" model. All models have an error term. A model is, by definition, a predictive function; it is an estimate, and all estimates have error. Retrace your steps and rerun the data, model, and testing.
OVER-FITTING may well be a problem with 10 explanatory factors. Check your information criterion (e.g., AIC or BIC). To guard against over-fitting, each additional unnecessary variable added should incur an appropriate penalty.
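As a sketch of how the penalty works (Python with NumPy assumed; aic_ols is a hypothetical helper written for this example, not a library function): a maximally unnecessary predictor, here an exact copy of an existing column, cannot reduce the residual sum of squares, so the AIC rises by exactly the 2-point penalty for the extra coefficient.

```python
import numpy as np

def aic_ols(y, X):
    """AIC for an OLS fit with intercept: n*log(SS_res/n) + 2k, where
    k counts the estimated coefficients. Constant terms are dropped
    since they cancel when comparing models on the same data."""
    n = len(y)
    A = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return n * np.log((resid @ resid) / n) + 2 * A.shape[1]

rng = np.random.default_rng(3)
n = 100
x = rng.normal(size=(n, 2))
y = x @ np.array([1.0, -1.0]) + rng.normal(size=n)

# Pad the model with a redundant predictor: a copy of column 0.
X_padded = np.column_stack([x, x[:, 0]])
print(aic_ols(y, x), aic_ols(y, X_padded))   # padded model scores worse
```

In general the extra variable will soak up a little noise, so the residual term falls slightly, but a genuinely unnecessary variable rarely buys enough fit to repay the penalty.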
INFERENTIAL ERROR. If something appears too good to be true, it may be just that.
Type 1 error = rejecting the null hypothesis when it is actually true (a false positive).
Type 2 error = failing to reject the null hypothesis when it is actually false (a false negative).
Type 3 error = rejecting the null hypothesis for the wrong reason.
Type 4 error = correctly rejecting the null hypothesis but making an error in interpreting the result.