Problems for Statistical Inference: Bouncing Betas
Multicollinearity is a big word to describe a simple concept; it means that independent variables are correlated (e.g., rX1X2 ≠ 0) in multiple correlation-regression. Chapter 11 explained that the multiple coefficient of determination (ρ²Y·X1X2) is not equal to the sum of the simple coefficients of determination (ρ²YX1 + ρ²YX2) when there is multicollinearity in a two-independent-variable case. Typical consequences of multicollinearity appear in Exhibit 18.1, where the multicollinearity between X1 and X2 is represented by the area d + d'. Some of the variance that each explains in Y is redundant (the area d). Consequently, the total variance explained in Y is less than the simple sum of the variance that each Xi explains. This two-independent-variable case generalizes to three or more independent variables.
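The overlap described above can be illustrated numerically. The following is a minimal sketch (not from the text) that simulates two correlated predictors and compares the multiple R² with the sum of the two simple R²s; the variable names and correlation strength are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Two correlated predictors (multicollinearity: corr(X1, X2) > 0).
x1 = rng.normal(size=n)
x2 = 0.7 * x1 + 0.7 * rng.normal(size=n)  # shares variance with x1
y = x1 + x2 + rng.normal(size=n)

def r_squared(X, y):
    """Coefficient of determination from an OLS fit with intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_full = r_squared(np.column_stack([x1, x2]), y)  # multiple R²
r2_x1 = r_squared(x1[:, None], y)                  # simple R² for X1
r2_x2 = r_squared(x2[:, None], y)                  # simple R² for X2

# Because part of the variance each predictor explains in Y is
# redundant (the area d), the multiple R² falls short of the sum
# of the simple R²s.
print(r2_full, r2_x1 + r2_x2)
assert r2_full < r2_x1 + r2_x2
```

With uncorrelated predictors the assertion would fail to hold with any margin; here the overlap makes the gap substantial, which mirrors the area d in Exhibit 18.1.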
This chapter addresses further issues raised by multicollinearity, chief among them the problems it creates for statistical inference.