126 SPATIAL POINT PROCESSES
region S′. Suppose that S′ is also bounded, and S ⊆ S′, but it is much easier to sample from λ over S′ than from S.
One way to describe sampling from a Poisson point process over S is as sampling from a point process over S′ with density f(x) = 1(x ∩ S^C = ∅). This density can then be used with the thinning approach: points that fall into S^C are retained with probability 0, and points that fall into S are retained with probability 1. Put more simply, generate a Poisson point process from λ over S′, and then only retain those points that fall into S. The result will be a Poisson point process over S.
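This restriction step is easy to sketch in code. The following Python snippet is an illustration, not the book's implementation; the function names and the choice of regions are mine (S is the unit disk, S′ = [−1,1]² its bounding square). It samples a homogeneous PPP of rate λ over S′ and keeps only the points that land in S.

```python
import math
import random

def sample_poisson(rng, mu):
    # Knuth's product-of-uniforms method; adequate for moderate mu
    L = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p < L:
            return k
        k += 1

def ppp_on_disk(lam, rng):
    """Sample a rate-lam Poisson point process on the unit disk S by
    sampling over the bounding square S' = [-1, 1]^2 and keeping only
    the points that fall in S (retention probability 0 outside S)."""
    n = sample_poisson(rng, lam * 4.0)  # 4.0 = area of S'
    pts = [(rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(n)]
    return [(a, b) for (a, b) in pts if a * a + b * b <= 1.0]
```

The retained points form a PPP of rate λ over the disk, so the expected number of points is λ times the disk's area.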
7.3 Jump processes
While AR is usually the easiest method to use, the running time for basic AR does
tend to grow exponentially with the size of S. Therefore a more sophisticated method
is needed for generating samples from large windows. What will be used is a special
type of Markovian process called a jump process. Preston [111] gave a method for
creating a jump process whose stationary distribution is a PPP with density f .
Definition 7.5. A process {X_t}_{t ∈ ℝ} is a jump process if, given X_t = x, the amount of time that the process stays at state x has an exponential distribution with mean 1/α(x). The state then jumps to a new state using kernel K, independent of the past history of the chain. (This is also known as a continuous-time Markov chain.)
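As a concrete illustration of Definition 7.5, here is a minimal generic simulator; all names are mine, and it assumes the rate function α and jump kernel K are supplied as Python callables. The two-state example after the function is only for illustration.

```python
import random

def simulate_jump_process(x0, alpha, kernel, t_end, rng):
    """Run a jump process up to time t_end: hold at state x for an
    Exp(alpha(x)) amount of time, then jump according to kernel(x, rng)."""
    t, x = 0.0, x0
    while True:
        hold = rng.expovariate(alpha(x))
        if t + hold >= t_end:
            return x  # the clock runs out while the process sits at x
        t += hold
        x = kernel(x, rng)

# Example: a two-state chain that flips between 0 and 1 at rate 1.
state = simulate_jump_process(
    0, lambda x: 1.0, lambda x, r: 1 - x, 10.0, random.Random(0))
```

For the two-state example the stationary distribution puts mass 1/2 on each state, which a long run (or many independent runs) will reflect.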
Preston’s framework had two types of moves in the chain.
1. The birth rate b(x,v) is the rate at which a point v is “born” and added to the configuration.
2. The death rate d(x,v) is the rate at which a point v “dies” and is removed from the configuration.
The simplest form of a birth-death chain has d(x,v) = 1 for all v ∈ x. That means that every point v ∈ x is given an exponential random variable of rate 1. At the time indicated by that exponential random variable, the point v “dies” and is removed from the configuration.
At the same time, the space S has a birth rate that controls how quickly points are born. The overall rate is b = ∫_{v ∈ S} b(x,v) dv. When a “birth” occurs, the location of the birth is chosen according to the unnormalized density b(x,·).
Use the notation x + v to indicate the configuration that has all of the points in x plus the point v. To achieve a stationary density f, it is necessary to balance the rate of births and deaths.
Theorem 7.2. Suppose
f(x)b(x,v) = f(x + v)d(x + v, v). (7.4)
Then f is a stationary density for the jump process.
[For a proof, see [111].]
When d(x,v) = 1 for all v ∈ x, the goal is to have f(x)b(x,v) = f(x + v). Moreover, the birth rate can be viewed in a Metropolis way, or in a thinning way. That is,
if the birth rate is b′(x,v), and the birth of point v is accepted with probability r(x,v), then the final birth rate is effectively b′(x,v)r(x,v).
As in Chapter 1, the end goal is to have the limiting distribution of the chain be the stationary distribution.
Definition 7.6. A jump process {X_t} over Ω is a continuous-time Harris chain if there exist sets A, B ⊆ Ω, a constant ε > 0, and a probability measure ρ with ρ(B) = 1, where
1. For T_A = inf{t ≥ 0 : X_t ∈ A}, (∀z ∈ Ω)(P(T_A < ∞ | X_0 = z) > 0).
2. If x ∈ A, then α(x) ≥ ε. Moreover, the amount of time that the state stays in x is min(T_1, T_2), where T_1 ∼ Exp(ε) and T_2 ∼ Exp(α(x) − ε). If the state jumps after T_1 time, then the new state is drawn from measure ρ, and does not depend on x.
Since a continuous-time Harris chain only jumps after a random amount of time, such chains can never be periodic. The notion of recurrence is precisely the same for these new chains as given in Definition 1.30. The analogue of Theorem 1.3 is nearly identical, except aperiodicity is not required.
Theorem 7.3 (Ergodic Theorem for continuous-time Harris chains). Let X_t be a recurrent continuous-time Harris chain with stationary distribution π. If P(R < ∞ | X_0 = x) = 1 for all x, then as t → ∞, for all measurable sets C and starting points x:
|P(X_t ∈ C | X_0 = x) − π(C)| → 0. (7.5)
So to show that the limiting distribution of a Preston birth-death chain is the stationary distribution, it suffices to show that the chain is recurrent.
7.3.1 Example: Strauss process
Consider how the Preston chain works for the Strauss process. The rate of deaths is always 1. The initial rate of births is β. Then when a point v attempts to be born, it is accepted and added to the configuration x with probability γ^(p_R(x+v) − p_R(x)). This makes the effective birth rate b(x,v) = βγ^(p_R(x+v) − p_R(x)), and hence (7.4) is satisfied.
This is implemented in the following pseudocode. Here ν is Lebesgue measure for S a bounded subset of ℝ^n.
Strauss_update Input: current state x, Output: next state x
1) Draw T_B ← Exp(βν(S))
2) Draw T_D ← Exp(#x)
3) If T_D < T_B (then there is a death)
4)   Draw v uniformly from x, x ← x − v
5) Else
6)   Draw v uniformly over S
7)   Draw U ← Unif([0,1])
8)   If U < γ^(#{w ∈ x : dist(v,w) ≤ R})
9)     x ← x + v
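The pseudocode above can be sketched in Python as follows. This is an illustrative implementation rather than the book's code; the names are mine, and it assumes S = [0,1]² so that ν(S) = 1.

```python
import math
import random

def strauss_update(x, beta, gamma, R, rng):
    """One jump of the Preston chain for the Strauss process on S = [0,1]^2
    (so nu(S) = 1). x is a frozenset of points; returns the next state."""
    t_b = rng.expovariate(beta)                       # time to next potential birth
    t_d = rng.expovariate(len(x)) if x else math.inf  # time to next death
    if t_d < t_b:
        v = rng.choice(sorted(x))      # each point dies at rate 1
        return x - {v}
    v = (rng.random(), rng.random())   # birth location, uniform over S
    near = sum(1 for w in x if math.dist(v, w) <= R)
    if rng.random() < gamma ** near:   # thin the birth: accept w.p. gamma^near
        return x | {v}
    return x
```

With γ = 0 the acceptance test passes only when v has no neighbors within distance R, so the chain then maintains a hard-core configuration.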
7.3.2 Locally stable
The Strauss process density has the property of being locally stable.
Definition 7.7. A PPP density f is locally stable if there exists a constant M such that for all configurations x and points v, f(x + v) ≤ M f(x).
That is, addition of a point does not increase the value of the density by more than a constant factor. For the Strauss process, M = β.
In general, for a locally stable process over a state space S with Lebesgue measure
m(S), the update procedure is as follows.
Locally_stable_update Input: current state x, Output: next state x
1) Draw T_B ← Exp(M · m(S))
2) Draw T_D ← Exp(#x)
3) If T_D < T_B (then there is a death)
4)   Draw v uniformly from x, x ← x − v
5) Else
6)   Draw v uniformly over S
7)   Draw U ← Unif([0,1])
8)   If U < f(x + v)/[M f(x)]
9)     x ← x + v
Exponentials have the nice property that the probability that one exponential is
smaller than an independent exponential is proportional to the rates.
Fact 7.1. Suppose A_1 ∼ Exp(λ_1) and A_2 ∼ Exp(λ_2) are independent exponentially distributed random variables. Then P(A_1 < A_2) = λ_1/(λ_1 + λ_2).
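Fact 7.1 is easy to check by simulation. The sketch below (function name and parameter values are my own choice) estimates P(A_1 < A_2) by Monte Carlo.

```python
import random

def prob_first_smaller(lam1, lam2, trials, seed=0):
    """Monte Carlo estimate of P(A_1 < A_2) for independent
    A_1 ~ Exp(lam1) and A_2 ~ Exp(lam2)."""
    rng = random.Random(seed)
    wins = sum(rng.expovariate(lam1) < rng.expovariate(lam2)
               for _ in range(trials))
    return wins / trials

# Fact 7.1 predicts lam1/(lam1 + lam2) = 2/(2 + 3) = 0.4 here.
est = prob_first_smaller(2.0, 3.0, 100_000)
```

With 100,000 trials the standard error is about 0.0015, so the estimate should land very close to 0.4.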
So if one just keeps track of the number of points in the PPP, then if this number is i, there is an i/[i + Mm(S)] chance of a death that reduces i by 1, and an Mm(S)/[i + Mm(S)] chance of a birth that increases i by 1.
Fact 7.2. Consider a Markov chain on {0,1,2,...} where there is a constant C such that P(N_{t+1} = i + 1 | N_t = i) = C/[C + i] and P(N_{t+1} = i − 1 | N_t = i) = i/[C + i]. Such a chain is aperiodic and recurrent, returning to the state i = 0 infinitely often.
The idea is that since C is fixed, the chance of moving upwards goes to 0 as the state increases. Hence the state never becomes too large. See for instance [84] for a formal proof.
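A quick simulation illustrates this downward drift. The helper below (name and parameter values are my own choice) runs the count chain of Fact 7.2 until it first returns to 0.

```python
import random

def steps_until_zero(C, start, max_steps, rng):
    """Run the count chain of Fact 7.2 from `start` and return the number
    of steps taken to first hit 0 (None if max_steps is exceeded)."""
    i = start
    for step in range(1, max_steps + 1):
        if rng.random() < C / (C + i):
            i += 1   # birth: up with probability C/(C+i)
        else:
            i -= 1   # death: down with probability i/(C+i)
        if i == 0:
            return step
    return None
```

For modest C the chain hits 0 quickly, which is what makes the dominating process in the next section return to the empty configuration infinitely often.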
What this means is that the dominating process will return to the empty set infinitely often. That implies the following result.
Fact 7.3. The locally stable update gives an aperiodic, recurrent Harris chain with stationary density f, so the continuous-time analogue of Theorem 1.3 guarantees that the distribution of the state of the process as t goes to infinity will approach the stationary distribution.
7.4 Dominated coupling from the past for point processes
Kendall and Møller [82] noted that implicit in the Locally_stable_update is a dominating process that allows the use of dominated CFTP. For an evolving PPP X_t, the dominating process Y_t contains every point in X_t and possibly some extra ones, because it allows every birth into the state.
Locally_stable_dominating_update Input: x, y Output: x, y
1) Draw T_B ← Exp(M · m(S))
2) Draw T_D ← Exp(#y)
3) If T_D < T_B (then there is a death)
4)   Draw v uniformly from y, y ← y − v, x ← x − v
5) Else
6)   Draw v uniformly over S
7)   Draw U ← Unif([0,1])
8)   y ← y + v
9)   If U < f(x + v)/[M f(x)]
10)    x ← x + v
This update function is monotonic. To see why, suppose that the input satisfies x ⊆ y. When T_D < T_B, that means that a point in y is about to die. Since x cannot have a birth without a birth in y, the death of the point in y must also come before any birth in x. If this point in y that is about to die is also in x, then it also dies in x. (Note that if v ∉ x then x − v = x.) Either way, after the death event it still holds that x ⊆ y.
On the other hand, if T_B ≤ T_D, then there is a point v that is added to y. It might or might not (depending on the value of U) be added to x, but it is always added to y. Hence, at the end of a birth event, x ⊆ y.
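The coupled update can be sketched as follows. This is an illustrative Python version (names are mine) that takes the density ratio f(x+v)/f(x) as a callable, assumes a square window, and preserves x ⊆ y at every event.

```python
import math
import random

def dominating_update(x, y, M, area, ratio, rng):
    """One event of the coupled (x, y) update on a window of the given
    area. Assumes x <= y and that ratio(x, v) = f(x+v)/f(x) <= M."""
    t_b = rng.expovariate(M * area)
    t_d = rng.expovariate(len(y)) if y else math.inf
    if t_d < t_b:
        v = rng.choice(sorted(y))   # a uniform point of y dies ...
        return x - {v}, y - {v}     # ... and dies in x too, if present
    v = (rng.random(), rng.random())
    u = rng.random()
    if u < ratio(x, v) / M:         # thinned birth into x
        x = x | {v}
    return x, y | {v}               # every birth enters y
```

Running it with the Strauss ratio βγ^(number of neighbors) and M = β keeps the domination x ⊆ y invariant at every step.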
So Y_t is a dominating process for X_t. Moreover, Y_t is a very simple birth-death process. The stationary distribution for the Y_t process is a PPP with rate λ(x) = M for all x ∈ S, so it is easy to generate a sample from Y_t.
Formally Y_t is a marked process, because for each point in Y_t there is a choice of uniform U_t called a mark. The marks are sufficient to determine the X_t, so it is the marked Y_t and X_t processes that are coupled together.
Now in order to use dominated CFTP, it is also necessary to be able to run the dominating process Y_t backwards in time. This is straightforward from a computational point of view, since the dominating process is reversible. However, it can get a bit confusing: when time is running backward, a “birth” in the reverse process is a “death” in the forward process, and a “death” in the backward process is a “birth” in the forward process.
The dominating process can be thought of as maintaining a list of birth and death events that happen backwards in time. Call this list EL, for event list. Let D(0) denote the configuration of the dominating process at time 0. Then sometime before time 0, a birth or death event occurred. Let D(1) denote the state of the dominating process immediately before this event occurred.
Similarly, let D(i) denote the state of the dominating process just before time t_i, where exactly i birth or death events occurred in the interval [t_i, 0). Suppose that the current event list holds the first n events that happened before time 0. It is necessary to be able to extend this list backwards in time to n′ > n events. This is done by the following pseudocode:
Generating_event_list
Input: event list EL of length n, D(n), n′ > n
Output: event list EL of length n′, D(n′)
1) For i from n to n′ − 1 do
2)   Draw B ← Exp(#D(i)), Draw D ← Exp(Mν(S))
3)   If B < D (then there is a birth)
4)     Draw v uniformly from D(i), U ← Unif([0,1])
5)     Add [birth v U] to EL and set D(i + 1) ← D(i) − v
6)   Else
7)     Draw v uniformly over S
8)     Add [death v 0] to EL and set D(i + 1) ← D(i) + v
Once the event list is generated, run the chain forward using the dominating process. If the chain couples from all starting states, then the resulting state must be stationary. The pseudocode is as follows.
Dominated_cftp_for_spatial_point_processes Output: X ∼ f
1) Start with EL empty
2) Draw N ← Pois(Mν(S))
3) Draw D_1,...,D_N independently from ν(·) over S
4) Let D(0) ← {D_1,...,D_N}
5) n ← 1, n′ ← 0
6) Repeat
7)   (EL, D(n)) ← Generating_event_list(EL, D(n′), n)
8)   For every X(n) ⊆ D(n)
9)     Run X(n) forward to X(0) using the events in EL
10)  n′ ← n, n ← 2n
11) Until all the resulting X(0) values are the same
12) Output X(0)
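A compact sketch of the whole procedure for the Strauss process follows. Rather than running X(n) forward from every subset of D(n), it uses the shortcut suggested by the proof of Theorem 7.4: extend the event list backwards until the dominating process is empty, at which point every trajectory below the dominating process must agree. The function names and the choice S = [0,1]² (so ν(S) = 1) are mine; this is an illustration, not the authors' implementation.

```python
import math
import random

def sample_poisson(rng, mu):
    # Knuth's product-of-uniforms method; fine for small mu
    L = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p < L:
            return k
        k += 1

def dominated_cftp_strauss(beta, gamma, R, seed):
    """Dominated CFTP for the Strauss process on S = [0,1]^2 (nu(S) = 1).
    Extends the event list backwards until the dominating process is
    empty, then replays the marked events forward from that moment."""
    rng = random.Random(seed)
    M = beta                             # local stability constant
    # D(0) ~ PPP of rate M over S, the stationary dominating state
    D = {(rng.random(), rng.random())
         for _ in range(sample_poisson(rng, M))}
    events = []                          # events[0] = most recent event
    while D:                             # walk backwards until D(i) is empty
        if rng.random() < len(D) / (len(D) + M):
            v = rng.choice(sorted(D))    # forward-time birth, seen in reverse
            D.discard(v)
            events.append(('birth', v, rng.random()))   # store the mark
        else:
            v = (rng.random(), rng.random())            # forward-time death
            D.add(v)
            events.append(('death', v, 0.0))
    X = set()                            # replay forward from the empty state
    for kind, v, u in reversed(events):
        if kind == 'birth':
            near = sum(1 for w in X if math.dist(v, w) <= R)
            if u < gamma ** near:        # accept birth w.p. gamma^near
                X.add(v)
        else:
            X.discard(v)
    return sorted(X)
```

Setting γ = 0 gives the hard-core case, where the output should contain no two points within distance R of each other; the same seed always reproduces the same sample.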
Theorem 7.4 (Kendall and Møller [82]). The above procedure generates X ∼ f with probability 1.
Proof. Set Y_n(n) = ∅ for all n, and use the top n events in the EL list to move Y_n forward to Y_n(0). Then as the number of events grows to infinity, the amount of time in the chain for the Y_n process is going to infinity, so in the limit as n goes to infinity, the distribution of Y_n(0) is going to the stationary density f from Theorem 7.3.
Now let T = min{n : D(n) = ∅}. Then for n > T, Y_n(0) is always the same, and equal to X, the output of dominated CFTP. But the only way that a sequence of equal random variables can converge (in measure) to density f is if every single one has density f.
As with regular CFTP, it is usually not necessary to actually run the X(n) process forward from every single starting state. Instead, a bounding chain that is coupled with the dominating chain can be used to test coalescence.