Working Paper 2004-3 February 2004
Abstract: This paper compares two methods for undertaking likelihood-based inference in dynamic equilibrium economies: a sequential Monte Carlo filter proposed by Fernandez-Villaverde and Rubio-Ramirez (2004) and the Kalman filter. The sequential Monte Carlo filter exploits the nonlinear structure of the economy and evaluates the likelihood function of the model by simulation methods. The Kalman filter estimates a linearization of the economy around the steady state. The authors report two main results. First, both for simulated and for real data, the sequential Monte Carlo filter delivers a substantially better fit of the model to the data as measured by the marginal likelihood. This is true even for a nearly linear case. Second, the differences in terms of point estimates, even if relatively small in absolute values, have important effects on the moments of the model. The authors conclude that the nonlinear filter is a superior procedure for taking models to the data.
JEL classification: C63, C68, E37
Key words: dynamic equilibrium economies, the likelihood function, the sequential Monte Carlo filter, the Kalman filter
Recently, a growing literature has focused on the formulation and estimation of dynamic equilibrium models using a likelihood-based approach. Examples include the seminal paper of Sargent (1989), and more recently, Bouakez, Cardia and Ruge-Murcia (2002), DeJong, Ingram and Whiteman (2000), Dib (2001), Fernandez-Villaverde and Rubio-Ramirez (2003), Hall (1996), Ireland (2002), Kim (2000), Landon-Lane (1999), Lubik and Schorfheide (2003), McGrattan, Rogerson and Wright (1997), Moran and Dolar (2002), Otrok (2001), Rabanal and Rubio-Ramirez (2003), Schorfheide (2000), and Smets and Wouters (2003a and 2003b), to name just a few. Most of these papers have used the Kalman filter to estimate a linear approximation to the original model.
This paper studies the effects of estimating the nonlinear representation of a dynamic equilibrium model instead of working with its linearized version. We document how estimating the nonlinear solution of the economy substantially improves the empirical fit of the model: The marginal likelihood of the economy, i.e., the probability that the model assigns to the data, increases by two orders of magnitude. This is true even for our application, the stochastic neoclassical growth model, which is nearly linear. We also report that, although the effects of linearization on point estimates are small, the impact on the moments of the model is of first-order importance. This finding is key for applied economists because quantitative models are widely judged by their ability to match the moments of the data.
Dynamic equilibrium models have become a standard tool in quantitative economics (see Cooley, 1995, or Ljungqvist and Sargent, 2000, for summaries of applications). An implication of these models is that they can be described as a likelihood function for observables, given the model's structural parameters, i.e., those characterizing preferences and technology.
The advantage of thinking about a model as a likelihood function is that, once we can evaluate this likelihood, inference becomes a direct exercise. In a classical environment we only need to maximize this likelihood function to get point estimates and standard errors. A Bayesian researcher can use the likelihood and her priors about the parameters to find the posterior. The advent of Markov chain Monte Carlo algorithms has facilitated this task. In addition, we can compare models by likelihood ratios (Vuong, 1989) or Bayes factors (Geweke, 1998) even if the models are misspecified and nonnested.
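The Bayesian route described above can be sketched with a random-walk Metropolis sampler. The example below is a hypothetical toy setting, not the paper's model: the data are Gaussian with an unknown mean, the prior is Gaussian, and the exact posterior is known in closed form, so the chain can be checked against it.

```python
import numpy as np

# Toy setting (for illustration only): y_t ~ N(mu, 1), prior mu ~ N(0, 10).
# The posterior is conjugate, so we can verify the sampler's output.
rng = np.random.default_rng(0)
y = rng.normal(2.0, 1.0, size=200)

def log_posterior(mu):
    # log-likelihood plus log-prior, dropping constants
    return -0.5 * np.sum((y - mu) ** 2) - 0.5 * mu ** 2 / 10.0

def metropolis(n_draws=20000, step=0.2, mu0=0.0):
    # Random-walk Metropolis: propose mu' = mu + step * eps,
    # accept with probability min(1, posterior(mu') / posterior(mu)).
    draws = np.empty(n_draws)
    mu, logp = mu0, log_posterior(mu0)
    for i in range(n_draws):
        prop = mu + step * rng.standard_normal()
        logp_prop = log_posterior(prop)
        if np.log(rng.uniform()) < logp_prop - logp:
            mu, logp = prop, logp_prop
        draws[i] = mu
    return draws

draws = metropolis()[5000:]            # discard burn-in
exact_post_mean = y.sum() / (len(y) + 0.1)  # conjugate posterior mean
```

The posterior mean of the retained draws should agree with the conjugate closed form to within Monte Carlo error; in a dynamic equilibrium model the only change is that `log_posterior` calls a filter to evaluate the likelihood.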
The previous discussion points out the need to evaluate the likelihood function. The task is conceptually simple, but its implementation is more cumbersome. Dynamic equilibrium economies do not have a "paper and pencil" solution. …
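The two ways of evaluating the likelihood that the paper compares can be illustrated on a toy linear-Gaussian state-space model (a hypothetical example, not the authors' growth economy). There the Kalman filter delivers the exact likelihood, so a bootstrap particle filter, which evaluates the likelihood by simulation, can be checked against it.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear-Gaussian state space (illustration only):
#   x_t = a * x_{t-1} + w_t,  w_t ~ N(0, q)
#   y_t = x_t + v_t,          v_t ~ N(0, r),   x_0 = 0 known.
a, q, r, T = 0.9, 0.5, 0.5, 100

x, ys = 0.0, []
for _ in range(T):
    x = a * x + rng.normal(0, np.sqrt(q))
    ys.append(x + rng.normal(0, np.sqrt(r)))
ys = np.array(ys)

def kalman_loglik(ys):
    # Exact log-likelihood from the prediction-error decomposition.
    m, P, ll = 0.0, 0.0, 0.0
    for y in ys:
        m_pred, P_pred = a * m, a ** 2 * P + q      # predict
        S = P_pred + r                               # innovation variance
        ll += -0.5 * (np.log(2 * np.pi * S) + (y - m_pred) ** 2 / S)
        K = P_pred / S                               # Kalman gain
        m = m_pred + K * (y - m_pred)                # update
        P = (1 - K) * P_pred
    return ll

def particle_loglik(ys, N=5000):
    # Bootstrap particle filter: propagate, weight, resample;
    # the mean weight at each step estimates p(y_t | y_{1:t-1}).
    parts, ll = np.zeros(N), 0.0
    for y in ys:
        parts = a * parts + rng.normal(0, np.sqrt(q), N)
        w = np.exp(-0.5 * (y - parts) ** 2 / r) / np.sqrt(2 * np.pi * r)
        ll += np.log(w.mean())
        parts = parts[rng.choice(N, N, p=w / w.sum())]  # multinomial resampling
    return ll
```

With a few thousand particles the simulated log-likelihood tracks the exact Kalman value closely in this linear case; the point of the paper is that the particle filter remains valid when the economy's true transition and measurement equations are nonlinear and the Kalman filter does not.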