1.6 A few applications
Perfect simulation methods have been successfully applied to a variety of problems
coming from many different areas. The list of applications includes:
• Markov random fields (specifically auto models). These are labellings of a graph
where the density is a product of functions of the labels on each node, and a
function of the labels on each edge. Examples of this type of problem include the
Ising model, the Potts model, sink-free orientations of a graph, and the autonormal
model.
• Permutations. A permutation can be thought of as an ordering of n objects. The
set of permutations is easy to sample from uniformly, but once the goal is to sample
from a nonuniform density over permutations, the problem can become very difficult.
• Stochastic differential equations. These are used to model continuous functions
that fluctuate randomly. They are used often in finance and as models of physical
systems.
• Spatial point processes. These are used as models for data that is in the form of
points in a space. Examples include the Strauss process, marked Hawkes processes,
and Matérn processes.
• Bayesian posteriors. In Bayesian statistics, a prior is a probability distribution on
parameters of interest. Conditioned on data, the prior is then updated using Bayes’
rule to give an unnormalized density that is called the posterior. The normalizing
constant for the posterior is often very difficult to compute exactly. Hence the
ability to sample from the posterior without calculating the normalizing constant
is essential to Bayesian inference. While it is not known how to perfectly sample
from every Bayesian posterior, many do have perfect simulation algorithms.
• Combinatorial objects. Given a graph G, consider the number of proper colorings
of the graph or the number of sink-free orientations. Sets of combinatorial objects
such as these tend to grow in size exponentially fast in the size of the underlying
graph. For many of these problems, the ability to draw samples from the set (and
restrictions on the set) leads to approximation algorithms for counting the number
of objects in the set. These types of problems can often be linked to problems
coming from physics or statistics.
• Markov processes. Markov chains are powerful modeling tools that describe how
a stochastic process is updated. Under simple conditions, the distribution of the
state of the process will converge to what is known as a stationary distribution.
Sampling from this stationary distribution directly can often be difficult. Often
the stationary distribution cannot be described exactly. Examples of these kinds
of problems dealt with here include the antivoter model, self-organizing lists, and
perpetuities.
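Several of the models above can be written down as explicit unnormalized densities. As a minimal illustrative sketch (not from the text; the graph, the spin values, and the inverse temperature beta are arbitrary choices), the Ising model from the first bullet assigns each configuration of ±1 spins a weight determined by how strongly neighboring spins agree:

```python
import math

def ising_weight(config, edges, beta):
    """Unnormalized Ising density: exp(beta * sum of spin products over edges)."""
    return math.exp(beta * sum(config[i] * config[j] for i, j in edges))

# A 2x2 grid of +/-1 spins, indexed 0..3, with nearest-neighbor edges.
edges = [(0, 1), (2, 3), (0, 2), (1, 3)]
beta = 0.5

# The all-(+1) configuration agrees on all 4 edges, so its weight is exp(4 * beta).
w = ising_weight((1, 1, 1, 1), edges, beta)
```

Evaluating the weight of any single configuration is cheap; the difficulty lies entirely in the normalizing constant, as discussed below.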
Each of these applications will be described more fully as we are ready to present
algorithms for them. One thing all of these applications have in common is that they
can be presented as an unnormalized density for which it is far too expensive com-
putationally to calculate the normalizing constant directly.
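To see why the direct computation is so expensive, consider a brute-force sketch (illustrative only; the small Ising-type model, edge set, and inverse temperature below are arbitrary assumptions). The normalizing constant sums the unnormalized weight over every configuration, and the number of configurations grows exponentially in the number of nodes:

```python
import itertools
import math

def partition_function(n, edges, beta):
    """Normalizing constant by brute force: sum the unnormalized Ising
    weight over all 2**n spin configurations -- exponential work in n."""
    return sum(
        math.exp(beta * sum(c[i] * c[j] for i, j in edges))
        for c in itertools.product([-1, 1], repeat=n)
    )

# Even this 4-node grid already takes 2**4 = 16 terms; a 100-node graph
# would need 2**100 terms, which is why direct computation is hopeless.
Z = partition_function(4, [(0, 1), (2, 3), (0, 2), (1, 3)], 0.5)
```

Perfect simulation sidesteps this sum entirely: it draws exact samples from the density while only ever evaluating unnormalized weights.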