Academic journal article: Journal of Economics and Finance

Stochastic Volatility Model under a Discrete Mixture-of-Normal Specification

Article excerpt

Published online: 15 April 2011

© Springer Science+Business Media, LLC 2011

Abstract This paper investigates the properties of a linearized stochastic volatility (SV) model originally from Harvey et al. (Rev Econ Stud 61:247-264, 1994) under an extended, more flexible specification (discrete mixtures of normals). General closed-form expressions for the moment conditions are derived. We show that our proposed model captures various tail behaviors more flexibly than the Gaussian SV model and can accommodate a certain correlation structure between the two innovations. Rather than using likelihood-based estimation methods via MCMC, we use an alternative procedure based on the characteristic function (CF). We derive analytical expressions for the joint CF and present our estimator as the minimizer of the weighted integrated mean-squared distance between the joint CF and its empirical counterpart (ECF). We complete the paper with an empirical application of our model to three stock indices: the S&P 500, the Dow Jones 30 Industrial Average index, and the Nasdaq Composite index. The proposed model captures the dynamics of the absolute returns well and presents consistent, supportive evidence for the Taylor effect and the Machina effect.

Keywords Stochastic Volatility Model · Mixtures of Normal · Characteristic Function · Integrated Squared Error · Absolute Return

JEL Classification C01 · C13 · C14


1 Introduction

Beginning with the seminal works of Mandelbrot (1963) and Fama (1965), substantial empirical evidence indicates that most time series of returns on financial assets are not normally distributed, but instead exhibit significant leptokurtosis (fat tails relative to the normal distribution) and, in many cases, skewness. More recently, empirical research has shown that financial time series also display stylized facts such as time-varying volatility and volatility clustering. These findings have sparked considerable interest in alternative non-normal model specifications that capture these empirical characteristics. A benchmark model, the Autoregressive Conditional Heteroscedasticity (ARCH) model, was developed by Engle (1982). In a standard ARCH model, the conditional variance is a linear function of past squared errors. Bollerslev (1986) proposed a Generalized ARCH (GARCH) specification that allows the conditional variance to be a linear function of both past squared errors and past conditional variances. Alternatively, the Stochastic Volatility (SV) models of Taylor (1986) provide another specification of the volatility dynamics: the conditional variances are themselves assumed to follow a latent stochastic process. Thus, two innovation processes drive the time-varying behavior in SV specifications, whereas GARCH-family models rely on a single error process. There is a vast literature on the statistical properties of both specifications, such as Kim et al. (1998), Bai et al. (2003), and Carnero et al. (2004). In this paper, we do not attempt a general survey;1 instead, we provide the references relevant to our work.
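For concreteness, the two conditional-variance mechanisms discussed above can be written as follows. The parameterization is a standard textbook one and is only illustrative; it does not necessarily match the notation of the papers cited. In a GARCH(1,1) model a single innovation drives both the return and its variance, whereas the SV model adds a second innovation to the log-variance equation:

\[
\text{GARCH(1,1):}\qquad y_t = \sigma_t \varepsilon_t, \qquad \sigma_t^2 = \omega + \alpha\, y_{t-1}^2 + \beta\, \sigma_{t-1}^2,
\]

\[
\text{SV:}\qquad y_t = \sigma_t \varepsilon_t, \qquad \ln \sigma_t^2 = h_t, \qquad h_t = \mu + \phi\,(h_{t-1} - \mu) + \sigma_\eta\, \eta_t,
\]

where \varepsilon_t and \eta_t are the two innovation processes referred to above. In the Gaussian SV benchmark both are standard normal; the discrete mixture-of-normals specification studied in this paper generalizes the distributional assumptions on the innovations.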

The SV model has intuitive appeal and a realistic modeling specification, but its empirical application has been limited by the intractability of its likelihood function. More specifically, since the volatility is modeled as a latent variable, the likelihood function involves an integral whose dimension equals the sample size. As is well known, this integral is extremely difficult to evaluate in analytical form; consequently, a variety of alternative estimation methods have been devised and used for estimating SV models (see, e.g., Broto and Ruiz 2004).
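To make the dimensionality issue explicit (schematic notation, using the SV specification sketched above), the likelihood obtained by integrating out the latent log-volatilities h_1, ..., h_T is

\[
L(\theta; y_1,\dots,y_T) \;=\; \int_{\mathbb{R}^T} \left[\, \prod_{t=1}^{T} p\!\left(y_t \mid h_t, \theta\right) \right] p\!\left(h_1,\dots,h_T \mid \theta\right)\, dh_1 \cdots dh_T,
\]

a T-dimensional integral with no closed form, which is what motivates simulation-based and moment-based alternatives.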

Unlike the likelihood-based methods, those based on moments are relatively easy to implement because they avoid the high-dimensional integration. …
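The CF-based estimator described in the abstract can be sketched schematically as follows; the weight function g(·) and the use of blocks of p consecutive observations are illustrative assumptions rather than the paper's exact construction. With z_t = (y_t, y_{t+1}, \dots, y_{t+p-1})', the empirical joint CF and the integrated-squared-error estimator are

\[
\hat{\varphi}_n(r) \;=\; \frac{1}{n}\sum_{t=1}^{n} \exp\!\left(i\, r' z_t\right), \qquad
\hat{\theta} \;=\; \arg\min_{\theta} \int \left| \hat{\varphi}_n(r) - \varphi(r;\theta) \right|^2 g(r)\, dr,
\]

where \varphi(r;\theta) is the model-implied joint CF and g(r) is a weight function chosen so that the integral is finite.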
