Data were analysed with SPSS, and the authors mentioned that in the multivariate logistic regression analysis they used the forward elimination method.
The only way to be certain about it is to ask the authors themselves. If you are the journal referee, you may also want to ask the authors to specify the type of analysis (Conditional, Wald or Likelihood ratio).
I agree with what Adrian and Zeljko have said. In addition, if you are a journal referee, you may wish to alert the authors to the fact that stepwise selection methods are generally frowned upon nowadays. E.g.,
Nowadays, some authors (e.g., Frank Harrell) consider the lack of any non-significant variables in a multivariable regression model a pretty good sign of over-fitting. See the "Lack of insignificant variables"[1] section in this author checklist, for example:
Model selection assigns a score to each candidate model and allows us to choose the model with the best score. Because examining every possible subset of the explanatory variables is usually impractical, a search algorithm, such as forward selection or backward elimination, is used to find the best model.
In the forward selection (inclusion) method, predictors are entered into the regression model one at a time based on the F ratio. The order in which variables enter is determined by their contribution to the explained variance.
In the backward elimination method, all the predictors are included in the regression equation at the start; predictors are then eliminated one at a time based on the F ratio.
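To make the forward procedure above concrete, here is a minimal, hypothetical Python sketch (statsmodels), with the entry test written as a p-value: for a single added predictor this is equivalent to an F-to-enter criterion, since F = t^2. The outcome name and the 0.05 threshold are illustrative assumptions, not SPSS defaults.

import statsmodels.api as sm

def forward_selection(df, outcome="y", alpha_enter=0.05):
    # Enter predictors one at a time; stop when no candidate meets the criterion.
    remaining = [c for c in df.columns if c != outcome]
    selected = []
    while remaining:
        # p-value of each candidate when added to the current model
        pvals = {}
        for cand in remaining:
            X = sm.add_constant(df[selected + [cand]])
            pvals[cand] = sm.OLS(df[outcome], X).fit().pvalues[cand]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha_enter:
            break  # no remaining candidate meets the entry criterion
        selected.append(best)
        remaining.remove(best)
    return selected

The entry order here is exactly the "contribution to explained variance" ordering described above: at each step, the candidate with the smallest p-value (largest partial F) goes in first.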
As has been stated, forward stepwise adds variables into your final model, while backward removes variables. It is generally thought that forward stepwise is preferred. You should check the SPSS technical documentation to understand its criteria for inclusion and elimination.
@ Marta, re "It is generally thought that forward stepwise is preferred."
Generally preferred BY WHOM? Here are some links I posted earlier in the thread where the clear preference is to avoid algorithmic selection methods altogether! ;-)
It's true that criticisms of both methods have emerged in the last several years, and yes, we want to avoid over-fitting a model that won't work or hold up with out-of-sample data. However, many of us still use stepwise regression, and it is certainly an important technique taught in all regression classes. My understanding of the two methods is that forward selection includes variables that strengthen the model, while backward elimination drops variables to strengthen the model.
I agree with that. Most of the time we use stepwise regression, whether forward or backward, but the purpose is to get a solid result through the model.
Truth, or using a valid method, is not what most people believe or do. "Too often, we learn each other's mistakes rather than learning from each other's mistakes" (G. King, 1986, How not to lie with statistics, American Journal of Political Science 30:666-687).
Including or dropping variables does not strengthen "the" model, but "a" model. There is a ranking and selection problem here: the ranking of the IVs will change from sample to sample, and stepwise selection in the sample at hand will favor IVs that only appear to perform better than they will do in "real life". On the topic of selection criteria and selection bias see: Zucchini W (2000) An Introduction to Model Selection. Journal of Mathematical Psychology 44:41-61.
Sssso I'm with Bruce (again): stop stepping, start solving :-)
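The sample-to-sample selection bias is easy to see in a small simulation. A hypothetical sketch in Python (numpy + statsmodels; all names and sizes are illustrative): generate an outcome that is unrelated to 20 noise predictors, then check how often the single best candidate, the one a first forward step would enter, comes out nominally significant at the 5% level.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_sims, n_obs, n_noise = 500, 100, 20
hits = 0
for _ in range(n_sims):
    X = rng.normal(size=(n_obs, n_noise))
    y = rng.normal(size=n_obs)  # outcome unrelated to every predictor
    # simple-regression p-value of each predictor, as a first forward
    # step would evaluate them
    pvals = [sm.OLS(y, sm.add_constant(X[:, [j]])).fit().pvalues[1]
             for j in range(n_noise)]
    if min(pvals) < 0.05:
        hits += 1  # stepwise would "find" a significant predictor
print(f"{hits / n_sims:.0%} of pure-noise datasets admit a 'significant' predictor")

Because the minimum of 20 p-values is being selected, the expected rate is roughly 1 - 0.95^20, about 64% rather than 5%, which is exactly the optimism Zucchini's paper warns about.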
I prefer backward selection for the same reason. Don't forget that it is just as important to consider potential confounders and important variables from a clinical standpoint - variables that don't show significance during the modeling phase may be hugely important clinically and should be included regardless of what your selection tool tells you.
Both backward elimination and forward selection methods are used to find the influence and statistical significance of potential confounders (independent variables) on the dependent variable. Stepwise regression is built into the multivariable linear regression procedure, and it lets you see the influence of a single variable, of multiple variables, and of compound variables or factors.
In my case, investigating the factors affecting engine valvetrain noise, backward elimination was the easiest method. It is also more practical and better for those who are investigating many factors, since backward elimination can capture the joint behaviour of the predictors. Backward elimination starts with all predictors in the model (top-down).
Forward selection is just the reverse of the backward method, but it starts with no predictors (bottom-up).
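For completeness, here is the mirror-image sketch of backward elimination, in the same hypothetical Python setting as the forward sketch earlier in the thread: start from the full model (top-down) and repeatedly drop the least significant predictor until every remaining one meets the criterion. The 0.10 removal threshold is illustrative, not an SPSS default.

import statsmodels.api as sm

def backward_elimination(df, outcome="y", alpha_remove=0.10):
    # Start with all predictors (top-down); drop the worst one at a time.
    predictors = [c for c in df.columns if c != outcome]
    while predictors:
        X = sm.add_constant(df[predictors])
        pvals = sm.OLS(df[outcome], X).fit().pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] <= alpha_remove:
            break  # every remaining predictor meets the criterion
        predictors.remove(worst)
    return predictors

Because the full model is fitted first, each predictor is judged in the presence of all the others, which is the "joint behaviour" advantage mentioned above.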