Some of the tables in this Appendix provide the results of multiple regression models. Multiple linear regression analysis involves the development of models that use independent variables that may be associated with the dependent variable of interest, such as mortality rates by MSA or PMSA. If the association (measured by the correlation coefficient, or Pearson r) between two independent variables is too strong (e.g., 0.90 or greater), the problem of "multicollinearity" must be addressed, usually by excluding one of the two intercorrelated variables. The independent variables are included in a model in which the association between each independent variable and the dependent variable is adjusted for the (linear) effects of the other independent variable(s) in the model. If certain statistical assumptions are met, a statistical test (based on the Gaussian, or normal, distribution) can be applied to each regression coefficient. This tests the "null" hypothesis that the coefficient is zero, that is, that the independent variable has no association with the dependent variable when the other independent variables are included in the model.
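The screening and fitting steps described above can be sketched as follows. The data, variable names, and the handling of the 0.90 cutoff are illustrative assumptions, not taken from the analyses in this Appendix.

```python
import numpy as np

# Hypothetical data: three candidate independent variables and a dependent
# variable (e.g., a mortality rate). Values are simulated for illustration.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + 0.05 * rng.normal(size=n)   # deliberately intercorrelated with x1
x3 = rng.normal(size=n)
y = 2.0 + 1.5 * x1 + 0.5 * x3 + rng.normal(size=n)

# Screen for multicollinearity with the Pearson r between two predictors;
# if |r| >= 0.90, exclude one of the pair before fitting.
r = np.corrcoef(x1, x2)[0, 1]
if abs(r) >= 0.90:
    X = np.column_stack([np.ones(n), x1, x3])        # drop x2, keep x1
else:
    X = np.column_stack([np.ones(n), x1, x2, x3])

# Ordinary least squares fit: the first coefficient is the intercept
# (constant), and each t value is the coefficient over its standard error.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
dof = n - X.shape[1]
sigma2 = (resid @ resid) / dof
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
t = beta / se   # compare against the null hypothesis that each coefficient is zero
```

Here x2 is constructed to be nearly a copy of x1, so the screening step drops it; the surviving coefficients and t values are then adjusted for the linear effects of the other predictors in the model.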
The results of the analysis usually include estimates of the regression coefficients for each independent variable, the t value (the ratio of each regression coefficient to its standard error), and an intercept value or constant for the equation. R2 (R squared) is a measure of how well the model "fits" the data: the proportion of variation in the dependent variable that is "explained" by the model. The R2 values in the tables are "adjusted," or more conservative, measures of goodness of fit of the model. Some computer packages commonly used for multiple linear regression analysis are summarized by Afifi and Clark (1984).
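As a minimal sketch of the fit statistics described above, R2 and its adjusted form can be computed from observed and fitted values. The data and predictor count below are hypothetical.

```python
import numpy as np

def r2_and_adjusted(y, yhat, n_predictors):
    """Return (R2, adjusted R2) for observed y and fitted yhat."""
    n = len(y)
    ss_res = np.sum((y - yhat) ** 2)          # residual sum of squares
    ss_tot = np.sum((y - np.mean(y)) ** 2)    # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    # The adjustment penalizes additional predictors, giving a more
    # conservative measure of goodness of fit than the raw R2.
    adj = 1.0 - (1.0 - r2) * (n - 1) / (n - n_predictors - 1)
    return r2, adj

# Illustrative observed and fitted values from a two-predictor model.
y = np.array([3.1, 4.0, 5.2, 6.1, 6.9, 8.2])
yhat = np.array([3.0, 4.2, 5.0, 6.3, 7.0, 8.0])
r2, adj = r2_and_adjusted(y, yhat, n_predictors=2)
```

The adjusted value is always at most the raw R2 and falls further below it as predictors are added relative to the number of observations.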