R2 tells you how much of the variance in your dependent variable is explained by the independent variable(s), on a 0-100% scale; adjusted R2 tells you the same thing after adjusting for the number of terms in the model. The issue is that R2 will increase (or at least never decrease) with each additional term, regardless of whether that term has any real relationship to the dependent variable, so you can appear to explain more variance without actually improving model fit. Adjusted R2 corrects for this bias by accounting for the degrees of freedom used. So, if you add terms to an ANOVA model, e.g., by dividing the independent variable into smaller and more numerous categories, the adjusted R2 will only increase if the gain in explained variance makes up for the loss of degrees of freedom. If it does not, your adjusted R2 will go down even though unadjusted R2 goes up, which indicates a poorer model fit. Some of this explanation has been stolen from here: https://discuss.analyticsvidhya.com/t/what-is-adjusted-r-squared-in-anova/5782
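A minimal sketch of the penalty described above, using the standard adjusted R2 formula (the specific R2 values, n, and p below are made-up illustration numbers, not from the answer):

```python
def adjusted_r2(r2, n, p):
    """Adjusted R^2 for a model with p predictors fit on n observations.

    adj_R2 = 1 - (1 - R2) * (n - 1) / (n - p - 1)
    """
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# A near-useless extra term barely raises raw R^2 but costs a degree of
# freedom, so adjusted R^2 falls even though raw R^2 rises:
base = adjusted_r2(0.700, n=30, p=2)   # raw R2 = 0.700 with 2 predictors
extra = adjusted_r2(0.705, n=30, p=3)  # raw R2 = 0.705 with 3 predictors
print(base, extra)  # the second value is lower than the first
```

This is the sense in which the added term must "make up for" the lost degree of freedom: the numerator gain in R2 has to outweigh the shrinking n - p - 1 denominator.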
First, the difference between R-squared and adjusted R-squared is that adjusted R-squared is a modified version of R-squared that has been adjusted for the number of predictors in the model. Adjusted R-squared increases only if a new term improves the model more than would be expected by chance, and it decreases when a predictor improves the model by less than expected by chance. An R-squared of 0.5 (50%) is often treated as a rough minimum for a meaningful fit, but even that is not a strong result, especially if the relationship is strongly linear, where a much higher R-squared would be expected. Keep in mind that a higher R-squared generally means more accurate predictions, because most of the data points lie close to the fitted line. If your data are non-linear, do not put too much confidence in R-squared; instead use error metrics such as RMSE or RSE, which measure prediction error directly and are more informative for non-linear models. If this answer does not help, email me.
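For reference, RMSE mentioned above is just the square root of the mean squared prediction error, so it is reported in the same units as the dependent variable. A minimal sketch (the example numbers are invented for illustration):

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error between observed and predicted values."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

# Errors of 0.5, -0.5, 0.5 -> squared mean 0.25 -> RMSE 0.5
y_true = [3.0, 5.0, 7.5]
y_pred = [2.5, 5.5, 7.0]
print(rmse(y_true, y_pred))  # 0.5
```

Because it is computed from the residuals alone, RMSE makes no assumption about the model being linear, which is why it is the safer fit metric here.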