Table of contents

  1. Markov Chain Monte Carlo
    1. Definition
    2. Links

Markov Chain Monte Carlo

Definition

A Markov chain is a sequence of random variables \(X_1, X_2, \ldots\) in which the distribution of \(X_{k + 1}\) depends only on the value of \(X_k\) and not on any earlier values in the chain. A realization of a Markov chain may be visualized with a trace plot, that is, a plot of the values of the chain against the iteration number. Under suitable conditions, the values in a realization of a Markov chain will eventually settle down, or converge, to an equilibrium distribution.
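
To make the definition concrete, here is a minimal sketch in Python (assuming NumPy and Matplotlib are available) of a random-walk Metropolis sampler, one standard way to construct such a Markov chain in MCMC. The equilibrium distribution here is a standard normal, and the final lines draw the trace plot described above; the function name and parameter values are illustrative, not from the original text.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

def metropolis_chain(n_iter=5000, step=1.0, x0=10.0):
    """Random-walk Metropolis chain targeting a standard normal.

    X_{k+1} depends only on X_k: a proposal is drawn around the
    current value and accepted or rejected, so the sequence is a
    Markov chain whose equilibrium distribution is N(0, 1).
    """
    log_target = lambda x: -0.5 * x**2  # log density of N(0, 1), up to a constant
    x = x0
    chain = np.empty(n_iter)
    for k in range(n_iter):
        proposal = x + step * rng.normal()
        # Accept with probability min(1, target(proposal) / target(x)),
        # computed on the log scale for numerical stability.
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        chain[k] = x
    return chain

chain = metropolis_chain()

# Trace plot: chain values against iteration number.  The chain starts
# far from equilibrium (x0 = 10) and settles down to N(0, 1).
plt.plot(chain, linewidth=0.5)
plt.xlabel("iteration")
plt.ylabel(r"$X_k$")
plt.title("Trace plot of a Metropolis chain")
plt.show()
```

Running the script shows the "settling down" behavior directly: the early iterations drift from the deliberately poor starting value toward zero, after which the trace fluctuates stably around the equilibrium distribution.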