Forecast Comparisons

Article excerpt

This paper compares the National Institute of Economic and Social Research (NIESR) forecasts for output, inflation and key public sector finance variables against the corresponding forecasts from HM Treasury (HMT), the Bank of England (Bank) and the Institute for Fiscal Studies (IFS). We find that NIESR outperforms the other major bodies, on average, in its forecasts for output and, in particular, inflation when simple scores are used. It also performs well in forecasting the government current budget surplus, but not public sector net borrowing. Statistical estimates of accuracy provide a less clear picture, but their reliability is blighted by the small sample size.

Keywords: evaluating forecasts; output forecasting; inflation forecasting; public sector finance forecasting

JEL classification: C53, E17

Introduction

In this note we look at forecasts of output growth, inflation, the government current surplus and public sector borrowing undertaken over the past eight years by the National Institute (NIESR), the Treasury (HMT), the Bank of England (Bank) (for output and inflation) and the Institute for Fiscal Studies (IFS) (for the surplus and borrowing). We compare these forecasts with the first outcome for the data, as has been common in other such exercises, (1) since these are the data that will induce policymakers to change their actions. For each variable we calculate the Mean Absolute Deviation (MAD), the Root Mean Squared Deviation (RMSD) and a simple score that counts, in each forecast round, whose forecast was nearer to the outcome. Given the short data period, the score (always reported as THEM:US) is revealing.
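For concreteness, the two statistical measures take their standard definitions; the notation below is ours rather than the article's. Writing e_t for the error of a forecast against the first outcome in round t of n rounds,

\mathrm{MAD} = \frac{1}{n}\sum_{t=1}^{n}\lvert e_t\rvert, \qquad \mathrm{RMSD} = \sqrt{\frac{1}{n}\sum_{t=1}^{n} e_t^{2}},

and the score is simply the number of rounds in which each forecaster's error was the smaller, reported as THEM:US.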

The Treasury produces two types of forecasts, and it appears that those in the spring Budget get marginally higher scores than the NIESR spring forecast, as we can see from table 1, whilst NIESR normally performs better in the autumn compared with the Pre-Budget Report. (2) In particular, NIESR scores highly in the autumn on its forecast of the target public sector current budget surplus for the next fiscal year. This is probably the single most important indicator when setting plans for fiscal policy for the next year. For both output and especially inflation, NIESR marginally outscores the Bank, as we can see from table 1, at least for the two forecasts considered. Compared with the IFS, overall scores on the public sector forecasts are even, but NIESR scores noticeably better on the target public sector current surplus for the next fiscal year. Compared with the others in the table, NIESR on average has a higher score for output (12:13) and especially inflation (9:14), and it has a higher score on the target current surplus for next year (9:11). With respect to inflation, the evaluation of NIESR's inflation forecast on pp. 60-69 of this Review suggests that its point forecast is reliable.

Methods of comparing forecasts

Forecast comparisons are always fraught with difficulty, especially over such short periods. A single forecast can distort comparisons over such a period, so we have to find a way to reduce the impact of outliers. The 'score' is probably the least sensitive to outliers, followed by the MAD. The RMSD is considered the industry standard and can most easily be associated with a (quadratic) loss function for evaluating the costs of forecast errors. With eight observations it is not always easy to interpret comparisons for this indicator, although they are still of value. Different forecasters also use different degrees of rounding, and comparisons can be spuriously accurate if one forecaster reports intervals and another reports point forecasts; hence any evaluation can be disputed. Where intervals are reported we use their midpoint. In addition, forecasts are produced for different purposes and hence can use marginally different indicators. HMT (and the IFS) forecast seasonally unadjusted public sector indicators for the fiscal year, whilst NIESR forecasts seasonally adjusted, National Accounts-consistent public sector outcomes, and hence each has to be evaluated against the relevant first data. …
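A purely illustrative example (the error figures are hypothetical and not drawn from table 1) shows why the measures rank this way in their sensitivity to outliers. Suppose forecaster A's absolute errors over three rounds are 0.1, 0.1 and 2.0 percentage points, while forecaster B's are 0.3 in every round. Then

\mathrm{MAD}_A = \frac{0.1+0.1+2.0}{3} \approx 0.73, \qquad \mathrm{MAD}_B = 0.30,

\mathrm{RMSD}_A = \sqrt{\frac{0.1^{2}+0.1^{2}+2.0^{2}}{3}} \approx 1.16, \qquad \mathrm{RMSD}_B = 0.30.

The score favours A by 2:1, since A is nearer in two of the three rounds, whereas both the MAD and, more sharply, the RMSD favour B: the single large error dominates the statistical measures, which is why the score is the measure least sensitive to outliers.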