Whether a model is statistical or generated by some other technique (such as symbolic regression or genetic programming), are there any universal, objective measures or criteria (some sort of test, etc.) that can identify it as over-fitted?
In general, over-fitting in prediction can be checked by validation: if the model gives good predictions on the training data but biased predictions on the validation data, it can be considered over-fitted. Specifically for regression, several statistics such as PRESS, predicted R^2, and AIC give an indication of over-fitting. A growing number of polynomial terms in the model is also a sign that it may be over-fitted.
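To illustrate the validation check, here is a minimal sketch using only numpy. The data, the held-out split, and the polynomial degrees are all illustrative assumptions, not from the answer above; the point is simply that raising model complexity keeps driving the training error down while the validation error tells a different story.

```python
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)

# Assumed synthetic data: a cubic trend plus noise.
x = np.linspace(-1.0, 1.0, 40)
y = x**3 - x + rng.normal(scale=0.1, size=x.size)

# Hold out every third point as a validation set.
val = np.zeros(x.size, dtype=bool)
val[::3] = True
x_tr, y_tr = x[~val], y[~val]
x_va, y_va = x[val], y[val]

def mse(deg):
    """Fit a degree-`deg` polynomial on the training split and
    return (training MSE, validation MSE)."""
    p = Polynomial.fit(x_tr, y_tr, deg)
    return (np.mean((p(x_tr) - y_tr) ** 2),
            np.mean((p(x_va) - y_va) ** 2))

for deg in (3, 15):
    tr, va = mse(deg)
    print(f"degree {deg:2d}: train MSE {tr:.4f}, validation MSE {va:.4f}")
```

By construction of least squares, the training MSE can only decrease as the degree grows; a widening gap between training and validation error is the signature of over-fitting that the answer describes.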
This is an excellent question! I would make it a bit more specific: suppose we have a parametric model with several parameters. How do we conclude that the model is over-fitted, and how do we recognize which variable contributes the most to this 'over-fitting'?
PRESS and predicted R^2 are calculated by omitting each data point in turn and predicting it from a model fitted to the remaining points; both therefore assess the predictive ability of the model. If the model is over-fitted, with too many variables and/or too few data points, its predictive ability will be low, so PRESS will be high and predicted R^2 will be low. Further, AIC penalizes the number of parameters and so helps in building parsimonious models with few variables; it gives an indication when a model is loaded with variables simply to attain a higher R^2.
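The leave-one-out procedure behind PRESS and predicted R^2 can be sketched as follows. The data set, coefficients, and the number of added noise predictors are my own illustrative assumptions; the calculation itself (omit point i, refit, predict point i, accumulate squared errors, then compare against total sum of squares) follows the description above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed data: 15 observations driven by 2 real predictors.
n = 15
X_real = rng.normal(size=(n, 2))
y = X_real @ np.array([1.5, -2.0]) + rng.normal(scale=0.5, size=n)

def press_and_pred_r2(X, y):
    """Leave-one-out PRESS and predicted R^2 for an OLS fit with intercept."""
    n = len(y)
    A = np.column_stack([np.ones(n), X])   # design matrix with intercept
    press = 0.0
    for i in range(n):
        keep = np.arange(n) != i
        beta, *_ = np.linalg.lstsq(A[keep], y[keep], rcond=None)
        press += (y[i] - A[i] @ beta) ** 2  # error predicting the held-out point
    sst = np.sum((y - y.mean()) ** 2)
    return press, 1.0 - press / sst

# Padding the model with pure-noise predictors raises the ordinary R^2
# but should inflate PRESS and depress predicted R^2.
for extra in (0, 8):
    X = np.column_stack([X_real, rng.normal(size=(n, extra))]) if extra else X_real
    press, pr2 = press_and_pred_r2(X, y)
    print(f"{2 + extra:2d} predictors: PRESS {press:.2f}, predicted R^2 {pr2:.3f}")
```

A large drop in predicted R^2 relative to ordinary R^2 after adding variables is exactly the over-fitting signal the answer refers to: the extra terms fit the training points but make the held-out predictions worse.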