Dynamical Conditional Independence Models and Markov Chain Monte Carlo Methods

By Carlo Berzuini, Nicola G. Best, Walter R. Gilks, and Cristiana Larizza
Journal of the American Statistical Association, December 1997


We develop sampling-based methods for models of observations that arise sequentially. Our interest is in applications where analysis of incoming data is required in real time, such as in clinical monitoring. We suppose that the model expands by progressively incorporating new data and new parameters. For example, in clinical monitoring, new patient-specific parameters are introduced with each new patient. Without loss of generality, we imagine that observations F_1, F_2, ..., F_t, ... arrive at integer times t = 1, 2, .... At each time t, the new data F_t are accompanied by a (possibly empty) set of new model parameters or missing data Φ_t. Thus the model for F_1, ..., F_t comprises unknowns Φ_1, ..., Φ_t. In such a dynamic model (DM), data or parameters incorporated at one expansion stage may at later stages become uninteresting in themselves, such as when a patient dies or is discharged. (Herein, we use the word "parameter" to mean any model unknown, including missing data.)
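The expanding-model setup above can be made concrete with a small simulation. The following sketch assumes a hypothetical toy model, not the paper's clinical one: at each stage a single new parameter φ_t ~ N(0, 1) enters together with one observation F_t ~ N(φ_t, σ²), so the parameter space grows by one unknown per datum.

```python
import numpy as np

# Hypothetical toy DM (illustrative only): at each time t a new parameter
# phi_t ~ N(0, 1) enters, and the new observation F_t ~ N(phi_t, sigma^2)
# depends only on that stage's parameter.
rng = np.random.default_rng(0)
sigma = 0.5

def expand_model(phis, Fs):
    """Expand the model by one stage: draw a new phi_t and its datum F_t."""
    phi_t = rng.normal(0.0, 1.0)
    F_t = rng.normal(phi_t, sigma)
    return phis + [phi_t], Fs + [F_t]

phis, Fs = [], []
for t in range(5):
    phis, Fs = expand_model(phis, Fs)

# The parameter space has grown with the data: one unknown per datum.
print(len(phis), len(Fs))  # 5 5
```

The point of the sketch is only the bookkeeping: the joint posterior at time t is over (Φ_1, ..., Φ_t), a space whose dimension grows with every arrival.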

Sampling-based methods of Bayesian inference and prediction include importance sampling and Markov chain Monte Carlo (MCMC). Suppose that at time t we have a sample H_t of values of (Φ_1, ..., Φ_t) from the posterior distribution π(Φ_1, ..., Φ_t | F_1, ..., F_t). Arrival of a new data item F_{t+1} shifts interest to the new posterior π(Φ_1, ..., Φ_{t+1} | F_1, ..., F_{t+1}), prompting us to generate a new sample H_{t+1} of values of (Φ_1, ..., Φ_{t+1}) from π(Φ_1, ..., Φ_{t+1} | F_1, ..., F_{t+1}). When computing the new sample H_{t+1}, it seems sensible to try to use information contained in the available sample H_t. Under conventional MCMC sampling, this is not possible; with each new data item, the available sample of parameter values must be discarded, and a new sample must be created by restarting the MCMC from scratch on the entire model. This waste of information causes responses to new data to become slow. In particular, it hampers application of the method in real-time contexts.
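The restart-from-scratch behaviour can be illustrated with a minimal random-walk Metropolis sampler on a hypothetical toy model (φ_i ~ N(0, 1) priors, F_i | φ_i ~ N(φ_i, σ²); none of this is from the paper). Each time a datum arrives, the chain must be rerun over the entire, now higher-dimensional, parameter vector:

```python
import numpy as np

rng = np.random.default_rng(1)

def log_post(phi, F, sigma=0.5):
    # Toy model: phi_i ~ N(0, 1) prior, F_i | phi_i ~ N(phi_i, sigma^2).
    return -0.5 * np.sum(phi**2) - 0.5 * np.sum((F - phi)**2) / sigma**2

def metropolis(F, n_iter=2000, step=0.3):
    """Random-walk Metropolis over the *entire* current parameter vector."""
    d = len(F)
    phi = np.zeros(d)
    lp = log_post(phi, F)
    for _ in range(n_iter):
        prop = phi + step * rng.normal(size=d)
        lp_prop = log_post(prop, F)
        if np.log(rng.uniform()) < lp_prop - lp:
            phi, lp = prop, lp_prop
    return phi

F = np.array([0.4])
for t in range(1, 4):
    # Each new datum forces a full restart on a model of growing dimension;
    # nothing from the previous chain's sample is reused.
    sample = metropolis(F)
    F = np.append(F, rng.normal(0.0, 1.0))
print(len(sample))  # 3
```

Every pass discards the previous sample entirely, which is exactly the waste of information the paragraph above describes.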

The aforementioned difficulty can be avoided by adopting sampling methods other than MCMC. Kong, Liu, and Wong (1993; henceforth KLW) proposed a method for sequential updating of posterior distributions based on importance sampling. They retained the original parameter sample [H.sub.0] throughout and took incoming information into account by dynamically adapting the importance weights associated with elements of [H.sub.0]. However, their method is not directly applicable to DMs with an expanding parameter space. Smith and Gelfand (1992) proposed a sampling-importance resampling (SIR) sequential updating scheme. Gamerman and Migon (1993) discussed sequential analysis of data within a dynamic hierarchical model that is a special case of our DMs. They obtained closed forms for the posterior and predictive distributions of interest. In doing this, they assumed knowledge of variance matrices (up to a scalar factor), linearity of the structural equations, and error normality. West (1991, 1993) considered sequential analysis of a special case of our DMs, through a sampling-based method that uses kernel density reconstruction techniques coupled with importance resampling.
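A single SIR update of the kind attributed to Smith and Gelfand above can be sketched as follows. This is a generic illustration under an assumed toy model (prior φ ~ N(0, 1), datum F | φ ~ N(φ, σ²), hypothetical value F_1 = 0.8), not the authors' own implementation: prior draws are reweighted by the likelihood of the new datum and then resampled in proportion to those weights.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 0.5
N = 5000

# Particles approximating the prior pi(phi) for the toy model
# phi ~ N(0, 1), F | phi ~ N(phi, sigma^2); hypothetical datum F_1 = 0.8.
F1 = 0.8
particles = rng.normal(0.0, 1.0, size=N)

# Importance weights proportional to the likelihood of the new datum.
log_w = -0.5 * (F1 - particles) ** 2 / sigma**2
w = np.exp(log_w - log_w.max())
w /= w.sum()

# SIR step: resample with probability proportional to the weights,
# yielding an (approximately) equally weighted posterior sample.
posterior = rng.choice(particles, size=N, p=w)

# Compare with the exact conjugate posterior mean, F1 / (1 + sigma^2).
print(posterior.mean(), F1 / (1 + sigma**2))
```

With enough particles the resampled mean lands close to the conjugate answer; repeating the reweight-and-resample step as each F_{t+1} arrives is the sequential scheme.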

We propose two methods that are in some respects developments of KLW's work. The first adapts an importance sampling approach to expanding parameter spaces, and the second combines importance sampling and MCMC sampling. Both methods exploit conditional independence between groups of model parameters, allowing sampled values of parameters that are no longer of interest to be discarded.
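The discarding idea in the paragraph above can be pictured with a trivial array sketch (an assumption-laden illustration, not the paper's algorithm): if each particle carries the full history (φ_1, ..., φ_t) but, by conditional independence, all later likelihood terms depend only on the current component, the earlier columns can be dropped once they are no longer of interest.

```python
import numpy as np

# 1000 particles, each carrying a history (phi_1, ..., phi_4) for a
# hypothetical DM.  If future data are conditionally independent of
# phi_1, ..., phi_3 given phi_4, only the last column need be retained.
particles = np.random.default_rng(3).normal(size=(1000, 4))
current = particles[:, -1:]  # retain only the component still needed
print(particles.shape, current.shape)  # (1000, 4) (1000, 1)
```

The storage per particle then stays bounded even as the model keeps expanding.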

In Section 2.1 we assume a general conditional independence structure for a DM, which we describe using a graph (as in Whittaker 1990).
