My adjusted R-square is 98% and the SSE is 0.07, which suggests the model fits very well. But why does one variable have a high positive correlation (91%) with the dependent variable while at the same time its coefficient in the regression model is -0.08?
Have you analyzed the correlations among the independent variables? Sometimes this type of result can be attributed to multicollinearity in your model (a high correlation between independent variables).
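A small simulation (entirely synthetic data, for illustration only) shows how a predictor can correlate positively with the response yet receive a negative coefficient once a collinear predictor enters the model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x2 = rng.normal(size=n)
x1 = 0.95 * x2 + 0.1 * rng.normal(size=n)      # x1 nearly duplicates x2
y = -0.5 * x1 + 2.0 * x2 + 0.1 * rng.normal(size=n)

# Marginal correlation of x1 with y is strongly positive...
r = np.corrcoef(x1, y)[0, 1]

# ...but in the multiple regression y ~ x1 + x2, the coefficient on x1 is
# negative, because x2 already carries almost all of the shared information.
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"corr(x1, y) = {r:.2f}, coefficient on x1 = {beta[1]:.2f}")
```

The marginal correlation measures x1 against y on its own, while the regression coefficient measures x1's effect with x2 held fixed; with near-duplicate predictors those two quantities can have opposite signs.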
Yes, collinearity can cause this. Perhaps you could try some alternative models and compare their performance. R-square is not the right criterion for this. Try using variables that subject-matter knowledge suggests, and avoid those that give you basically the same information. (Principal components might help, but then interpretation becomes vaguer.) Graphical residual analysis can be useful, and holding out some test data can help avoid overfitting.
The estimated variance of the prediction error is useful, both for individual cases and for totals over finite populations. Because sigma has to be estimated, any bias in the model will influence the estimate of sigma, and this in turn affects the estimated variance of the prediction error. It can therefore serve as a good overall indicator of accuracy.
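As a sketch (synthetic data, assumed OLS setting): for a new point x0, the estimated variance of the prediction error is s² · (1 + x0ᵀ(XᵀX)⁻¹x0), where s² is the residual variance estimate, which is exactly where any model bias enters:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)   # true sigma^2 = 1

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
p = X.shape[1]
s2 = resid @ resid / (n - p)                 # estimate of sigma^2; model bias
                                             # would inflate this quantity

x0 = np.array([1.0, 0.5])                    # hypothetical new observation
XtX_inv = np.linalg.inv(X.T @ X)
pred_var = s2 * (1.0 + x0 @ XtX_inv @ x0)    # est. variance of y0 - y0_hat
```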
Yes, this is very likely due to multicollinearity. Try estimating the VIF (variance inflation factor) for each variable in your data; there are many ways to do that in R or Python.
In short: a large VIF indicates that the variable is strongly correlated with some other variable(s).
Then simply drop the variable with the largest VIF and rebuild the model without it. The model may become more plausible; if you still don't like it, repeat the procedure.
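A minimal sketch of this procedure on synthetic data, in plain NumPy (statsmodels also provides `variance_inflation_factor` for this):

```python
import numpy as np

def vif(X):
    """VIF per column: 1 / (1 - R^2) from regressing that column on the
    remaining columns plus an intercept."""
    n, p = X.shape
    out = np.empty(p)
    for i in range(p):
        y = X[:, i]
        others = np.column_stack([np.ones(n), np.delete(X, i, axis=1)])
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1.0 - resid.var() / y.var()
        out[i] = 1.0 / (1.0 - r2)
    return out

rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
x2 = x1 + 0.05 * rng.normal(size=200)          # nearly collinear with x1
x3 = rng.normal(size=200)                      # unrelated to the others
X = np.column_stack([x1, x2, x3])

v = vif(X)                                     # x1, x2 large; x3 near 1
worst = int(np.argmax(v))                      # drop the highest-VIF column...
X_reduced = np.delete(X, worst, axis=1)        # ...and rebuild the model
```

A common rule of thumb flags VIF above roughly 5-10 as problematic; in the sketch above, x1 and x2 are far beyond that while x3 stays near 1.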