Forward selection, backward elimination, and stepwise selection

Explain the difference between forward selection, backward elimination, and stepwise selection for choosing which variables to include in a regression equation.

Explain how a researcher should interpret a correlation coefficient.

Solution Preview

Stepwise regression is a strategy for model building that examines only a fraction of all possible regressions in order to arrive at a model that is reasonably good, even if not optimal. Stepwise regression examines one predictor at a time, deciding whether to put it into the model or leave it out. There are several main varieties of stepwise regression. In its regression module, StatTools has separate options for backward, forward, and regular stepwise regression strategies. I will next discuss these three options in general terms.
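
As a rough illustration of this one-predictor-at-a-time idea, here is a minimal forward-selection sketch. It uses Python with statsmodels rather than StatTools, and the 0.05 entry threshold, the function name forward_select, and the toy data are all assumptions made for the example: each round adds the candidate predictor whose p-value in the expanded model is smallest, and stops when no candidate clears the threshold.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def forward_select(X, y, alpha=0.05):
    """Forward selection: repeatedly add the candidate predictor with the
    smallest p-value, stopping when no candidate is significant at `alpha`.
    (alpha=0.05 is an assumed entry threshold for this illustration.)"""
    kept, candidates = [], list(X.columns)
    while candidates:
        # p-value each candidate would get if it joined the current model
        pvals = {c: sm.OLS(y, sm.add_constant(X[kept + [c]])).fit().pvalues[c]
                 for c in candidates}
        best = min(pvals, key=pvals.get)
        if pvals[best] > alpha:          # nothing left worth adding: stop
            break
        kept.append(best)
        candidates.remove(best)
    return kept

# Toy usage: only 'a' and 'c' actually drive y, and they are what gets kept.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(200, 5)), columns=list("abcde"))
y = 2 * X["a"] - 3 * X["c"] + rng.normal(size=200)
print(forward_select(X, y))   # typically ['c', 'a']
```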

1. Backward Stepwise Regression
In backward stepwise regression, you begin with the model that has all of the predictors included. You then drop the worst predictor and rerun the regression with the remaining predictors, and from that result you again drop the worst predictor and rerun. At each step the worst remaining predictor is removed, so the model becomes simpler with each step (a short code sketch of this procedure follows the bullet points below).
• With k predictors, at most k steps are needed to complete the backward stepwise procedure. At most k of the 2^k possible models must be examined.
• At each step, the predictor removed is ordinarily the one with smallest absolute T-value (equivalently, largest p-value).
• Justification: Removal of the predictor with least absolute T-value will reduce the R-square by the least of any ...
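
Here is a minimal sketch of this backward procedure, again in Python with statsmodels rather than StatTools; the 0.05 stopping rule, the function name backward_eliminate, and the toy data are assumptions for the example. At each step the predictor with the largest p-value (equivalently, the smallest absolute t-value) is dropped and the regression is rerun, until every remaining predictor is significant.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def backward_eliminate(X, y, alpha=0.05):
    """Backward elimination: start from the full model and repeatedly drop
    the predictor with the largest p-value (smallest absolute t-value)
    until all remaining predictors are significant at `alpha`.
    (alpha=0.05 is an assumed stay threshold for this illustration.)"""
    kept = list(X.columns)
    while kept:
        fit = sm.OLS(y, sm.add_constant(X[kept])).fit()
        pvals = fit.pvalues.drop("const")        # ignore the intercept
        worst = pvals.idxmax()                   # largest p-value = smallest |t|
        if pvals[worst] <= alpha:                # everything significant: stop
            return fit, kept
        kept.remove(worst)                       # drop it and rerun
    return None, []

# Toy usage: the three pure-noise columns get eliminated one per step.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(200, 5)), columns=list("abcde"))
y = 2 * X["a"] - 3 * X["c"] + rng.normal(size=200)
fit, kept = backward_eliminate(X, y)
print(kept)   # typically ['a', 'c']
```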

Solution Summary

The solution explains the difference between forward selection, backward elimination, and stepwise selection.
