Evaluating BLS Labor Force, Employment, and Occupation Projections for 2000: In 1989, BLS First Projected Estimates for the Year 2000 of the Labor Force, Employment, and Occupations; in Most Cases, the Accuracy of BLS Projections Was Comparable to Estimates from Naive Extrapolation Models
Stekler, H. O., Thomas, Rupin, Monthly Labor Review
The purpose of any evaluation of economic forecasts is to find the sources of the errors and to improve future forecasts. The errors may result from internal procedures, assumptions, or methods, and from external inputs. (1) Moreover, because the forecasts are intended to be used for some function or purpose, the evaluation should pose questions that determine how well the predictions fulfilled this intended purpose.
Thus, for a forecast evaluation to be valuable, it must pose the right questions. This is true whether the forecasts are short-term macroeconomic predictions or the long-term BLS projections of labor force, employment, and occupation trends. However, an evaluation of these BLS long-term projections poses three methodological issues that usually are not encountered in analyses of short-term macroeconomic forecasts. First, no other organization made projections of these variables. Consequently, there is no benchmark for judging the BLS forecasts. Second, these projections are long term rather than the short-term macroeconomic forecasts that have been evaluated in the past. Thus, the questions that must be addressed in this evaluation can differ from those addressed in evaluations of macroeconomic forecasts. Finally, this is a one-time forecast--that is, the evaluation is concerned with the BLS projections for a single year, 2000--while most forecast evaluations have examined multiple forecasts.
This article evaluates the labor force, employment by industry, and occupation projections that BLS made in 1989 for the year 2000. (2) While these forecasts have already been evaluated individually, (3) it is possible both to ask additional questions that were not addressed in those studies and to use evaluation methodologies different from those employed previously. In addition, this article, whenever possible, uses the same methodologies to evaluate the projections of all three of these variables.
Because there are no other forecasts that are comparable to the BLS projections, it is necessary to construct a benchmark for the projections of each variable. In each case, BLS projections are compared with similar data obtained from the forecasts of a benchmark. The benchmarks that were selected all use data that were available at the time when BLS projections were prepared. In actuality, the benchmarks are naive models such as: (1) projecting the latest available information; or (2) predicting that the change over the forecast period is equal to that observed over the previous time interval, which is of the same length as the forecast period. (4)
Because the projections that are being analyzed in this article take 1988 as their base year, the forecast period is 12 years in length. Consequently, the change from 1976 to 1988 was used as the basis for this benchmark.
At a minimum, the BLS projections should be more accurate than the forecasts of these naive models.
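The two naive benchmark models can be sketched in a few lines of code. The following is a minimal illustration, not the procedure BLS or the authors actually used; the labor force figures are hypothetical stand-ins for the historical data.

```python
def naive_no_change(latest_value):
    """Model 1: project the latest available observation forward unchanged."""
    return latest_value


def naive_equal_change(value_start, value_end):
    """Model 2: predict that the change over the forecast period equals the
    change observed over the preceding interval of the same length."""
    return value_end + (value_end - value_start)


# Example: a 12-year horizon (1988 -> 2000), so the base interval is
# 1976 -> 1988, matching the article's setup. Values are hypothetical,
# in millions of persons.
labor_force_1976 = 96.2
labor_force_1988 = 121.7

projection_model_1 = naive_no_change(labor_force_1988)
projection_model_2 = naive_equal_change(labor_force_1976, labor_force_1988)
print(projection_model_1)  # 121.7
print(projection_model_2)  # 147.2 = 121.7 + (121.7 - 96.2)
```

The second model simply doubles the pace of the 1976-88 change; any BLS projection that cannot beat such mechanical rules adds little forecasting value.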
Long-term projections vs. short-term forecasts. The questions that are appropriate for evaluating short-term forecasts have been examined in detail, (5) but the questions that should be asked in analyzing longer run projections have not been given the same degree of attention. Because BLS projections primarily focus on long-run trends, the questions asked and the statistics used in evaluating these forecasts should be related to the primary emphasis of the forecast. Thus, the two basic questions to be asked in evaluating these projections are: (1) Have the trends, especially structural changes, been predicted correctly? (2) Were these forecasts better than those that could have been produced by a benchmark method? Additional questions, such as what the sources of the errors were and whether the forecasts improved over time, can also be posed.
The statistics that can answer these questions include the following: (1) the percentage of components where the direction of change was predicted correctly; (2) dissimilarity indexes that measure how closely the projected structure of the labor force, employment, or occupations matches the actual structure; (3) contingency tables that determine whether the actual and predicted directions of change are related; and (4) Spearman rank correlation coefficients that measure the relationship between the predicted and actual changes of the components of an aggregate forecast. …
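Statistics (1), (2), and (4) above are straightforward to compute. The sketch below illustrates them on hypothetical predicted and actual values; the data, and the tie-free rank formula used for the Spearman coefficient, are illustrative assumptions, not the article's actual calculations.

```python
def pct_correct_direction(predicted, actual):
    """Statistic (1): share of components whose predicted sign of change
    matched the actual sign of change."""
    hits = sum(1 for p, a in zip(predicted, actual) if (p > 0) == (a > 0))
    return 100.0 * hits / len(predicted)


def dissimilarity_index(shares_a, shares_b):
    """Statistic (2): half the sum of absolute differences between two
    percentage distributions (e.g., projected vs. actual occupational
    shares); 0 means identical structures."""
    return 0.5 * sum(abs(a - b) for a, b in zip(shares_a, shares_b))


def spearman_rank_corr(x, y):
    """Statistic (4): Spearman coefficient via the rank-difference
    formula 1 - 6*sum(d^2) / (n*(n^2 - 1)); assumes no tied values."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d_sq = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d_sq / (n * (n ** 2 - 1))


# Hypothetical predicted vs. actual employment changes for five industries:
pred = [4.1, -1.2, 2.5, 0.8, -3.0]
act = [3.6, -0.4, 1.9, -0.2, -2.1]
print(pct_correct_direction(pred, act))  # 80.0 (four of five signs match)
print(spearman_rank_corr(pred, act))     # 1.0 (identical rank orderings)

# Hypothetical projected vs. actual shares (percent) for three occupations:
print(dissimilarity_index([50, 30, 20], [45, 35, 20]))  # 5.0
```

The dissimilarity index can be read as the share of the distribution that would have to shift categories for the projected structure to match the actual one.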