142 THE RANDOMNESS RECYCLER
on {1,2,3}. In this chain the transition probabilities p(a,b) = P(X_{t+1} = b | X_t = a) are p(1,1) = p(1,2) = p(2,1) = p(2,3) = p(3,2) = p(3,3) = 1/2.
The stationary distribution for this Markov chain is uniform over {1,2,3}.
Therefore, a simple way of generating a stationary stopping time is to let Y ∼ Unif({1,2,3}) and then let T_1 = inf{t : X_t = Y}. Then X_{T_1} = Y and so has the correct distribution, but T_1 and X_{T_1} are not independent conditioned on X_0. For instance, when X_0 = 1, P(T_1 ≥ 2 | X_{T_1} = 3) = 1, since the chain needs at least two steps to move from state 1 to state 3, but P(T_1 ≥ 2) < 1. So this is an example of a stationary time, but not a strong stationary time.
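The dependence between T_1 and X_{T_1} is easy to see by simulation. The following sketch (my own, not from the text; the function names are invented for illustration) runs the chain from X_0 = 1 and estimates both probabilities:

```python
import random

def step(x):
    # p(1,1) = p(1,2) = p(2,1) = p(2,3) = p(3,2) = p(3,3) = 1/2
    return random.choice({1: [1, 2], 2: [1, 3], 3: [2, 3]}[x])

def stationary_time(x0):
    # T_1 = inf{t : X_t = Y} for Y ~ Unif({1,2,3}); returns (T_1, X_{T_1})
    y = random.choice([1, 2, 3])
    t, x = 0, x0
    while x != y:
        x = step(x)
        t += 1
    return t, x

random.seed(1)
runs = [stationary_time(1) for _ in range(100_000)]
# X_{T_1} has the stationary (uniform) distribution...
freq = [sum(1 for _, x in runs if x == s) / len(runs) for s in (1, 2, 3)]
# ...but T_1 and X_{T_1} are dependent given X_0 = 1:
# every path from 1 to 3 takes at least two steps
hits3 = [t for t, x in runs if x == 3]
p_ge2_given_3 = sum(1 for t in hits3 if t >= 2) / len(hits3)
p_ge2 = sum(1 for t, _ in runs if t >= 2) / len(runs)
print(freq, p_ge2_given_3, p_ge2)
```

The empirical frequencies of X_{T_1} come out near 1/3 each, while P(T_1 ≥ 2 | X_{T_1} = 3) is exactly 1 and P(T_1 ≥ 2) is strictly below 1, exhibiting the dependence.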
Here is an SSST for this Markov chain. Say that the chain tries to move right when X_{t+1} > X_t, or when X_{t+1} = X_t = 3. Denote this type of step with an r. Similarly, the chain tries to move left when X_{t+1} < X_t or X_{t+1} = X_t = 1. Denote this with an ℓ. So if the chain started at X_0 = 2 and the first four steps were ℓrℓr, then X_4 = 2 as well.
The SSST looks at the chain in groups of two moves. First, wait until X_{3k} = 1 for some integer k. Next, look at the next two moves of the chain. From state 1, each of the four pairs rr, rℓ, ℓr, ℓℓ occurs with probability 1/4, and they leave the chain at states 3, 1, 2, 1 respectively. So if the two moves are either rr, ℓℓ, or ℓr, then since each of these is equally likely to occur, X_{3k+2} is equally likely to be in {1,2,3}.
That is, for

T_2 = inf{3k + 2 : X_{3k} = 1, and the next two moves are either rr, ℓℓ, or ℓr},   (8.1)

then X_{T_2} ∼ Unif({1,2,3}). Moreover, knowing the value of T_2 does not change the distribution of X_{T_2}, even when conditioned on X_0.
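As a sanity check on (8.1), here is a small simulation (my own sketch, not the author's code) that runs the chain to T_2 and tallies X_{T_2}:

```python
import random

def labeled_step(x):
    # one step of the walk on {1,2,3}; label the move r if the chain
    # tries to move right (X_{t+1} > X_t, or X_{t+1} = X_t = 3), else l
    nxt = random.choice({1: [1, 2], 2: [1, 3], 3: [2, 3]}[x])
    moved_right = nxt > x or nxt == x == 3
    return nxt, 'r' if moved_right else 'l'

def run_ssst(x0):
    # return (T_2, X_{T_2}) for the stopping time in (8.1)
    x, t = x0, 0
    while True:
        # wait until a time of the form 3k at which the chain sits at state 1
        while not (t % 3 == 0 and x == 1):
            x, _ = labeled_step(x)
            t += 1
        # take the next two moves; accept unless they are r then l
        x, m1 = labeled_step(x)
        x, m2 = labeled_step(x)
        t += 2
        if m1 + m2 != 'rl':
            return t, x

random.seed(2)
runs = [run_ssst(2) for _ in range(60_000)]
freq = [sum(1 for _, x in runs if x == s) / len(runs) for s in (1, 2, 3)]
print(freq)  # each entry should be close to 1/3
```

Rejected pairs (r then ℓ) return the chain to state 1, so the outer loop simply waits for the next opportunity.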
8.1.1 Example: SSST for simple symmetric random walk on {1,2,...,n}
Now extend the walk to {1,2,...,n} so that p(a, a−1) = p(a, a+1) = 1/2 for all a ∈ {2,...,n−1}, and p(1,1) = p(n,n) = 1/2.
To extend the SSST to this more complicated chain, it is easiest to consider how the distribution of X_t evolves over time. Suppose X_0 = 1. This can be represented by the probability vector that places 100% of the probability on state 1, namely, (1, 0, 0, ..., 0). After one step in the chain, the probability vector will be (1/2, 1/2, 0, 0, ..., 0), indicating that P(X_1 = 1 | X_0 = 1) = P(X_1 = 2 | X_0 = 1) = 1/2, so now X_1 ∼ Unif({1,2}).
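The evolution of these probability vectors can be reproduced with exact rational arithmetic. This sketch (my own; n = 6 is an arbitrary choice) pushes the point mass at state 1 through two steps of the chain:

```python
from fractions import Fraction

def transition_matrix(n):
    # p(a, a-1) = p(a, a+1) = 1/2 in the interior; p(1,1) = p(n,n) = 1/2
    P = [[Fraction(0)] * (n + 1) for _ in range(n + 1)]  # index 0 unused
    for a in range(1, n + 1):
        P[a][max(a - 1, 1)] += Fraction(1, 2)
        P[a][min(a + 1, n)] += Fraction(1, 2)
    return P

def push(pi, P, n):
    # one step of the chain: multiply the row vector pi by P
    return [sum(pi[a] * P[a][b] for a in range(1, n + 1))
            for b in range(n + 1)]

n = 6
P = transition_matrix(n)
pi0 = [Fraction(0)] * (n + 1)
pi0[1] = Fraction(1)             # X_0 = 1 with probability 1
pi1 = push(pi0, P, n)            # (1/2, 1/2, 0, ...)
pi2 = push(pi1, P, n)            # (1/2, 1/4, 1/4, 0, ...)
print(pi1[1:], pi2[1:])
```

Using Fraction instead of floats keeps the vectors exact, which matters when checking mixture identities like the one below.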
Now take one more step in the chain. The probability vector is now (1/2, 1/4, 1/4, 0, 0, ..., 0). The state X_2 is no longer uniform over a set of states. However, the distribution of X_2 is a mixture of two uniforms: uniform over {1,2,3} with weight 3/4, and uniform over {1} with weight 1/4. See Figure 8.1.
Given X_2 = x, draw Y | X_2 = x as uniform over [0, P(X_2 = x)]. Then (X_2, Y) ∼ Unif({(x, y) : 0 ≤ y ≤ P(X_2 = x)}).
Note that [X_2 | Y ≤ 1/4] ∼ Unif({1,2,3}): the part of the region below height 1/4 has area 1/4 over each of the three states. Similarly, [X_2 | Y > 1/4] ∼ Unif({1}).
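This auxiliary-variable construction can be checked by simulation. In the sketch below (my own, not the author's code), the threshold 1/4 = P(X_2 = 2) is the height below which the region has equal area over all three states:

```python
import random

p2 = {1: 0.5, 2: 0.25, 3: 0.25}   # distribution of X_2 given X_0 = 1

def step(x):
    # one step of the walk; only states 1 and 2 are ever stepped from here
    return random.choice([1, 2]) if x == 1 else random.choice([x - 1, x + 1])

random.seed(3)
low, high = [], []
for _ in range(200_000):
    x = step(step(1))              # X_2 starting from X_0 = 1
    y = random.uniform(0, p2[x])   # Y | X_2 = x ~ Unif[0, P(X_2 = x)]
    (low if y <= 0.25 else high).append(x)

freq_low = [low.count(s) / len(low) for s in (1, 2, 3)]
print(freq_low, set(high))  # freq_low near 1/3 each; high contains only state 1
```

Conditioning on Y ≤ 1/4 leaves X_2 approximately uniform over {1,2,3}, while Y > 1/4 can only occur when X_2 = 1.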
In general, suppose that X_t ∼ Unif({1,...,i}) for i ∈ {2,...,n−1}. Then X_{t+1} will be a mixture of the uniform distribution over {1,...,i−1} and the uniform distribution over {1,...,i+1}. When X_{t+1} > i−1, then X_{t+1} must have been a draw from the latter distribution. When X_{t+1} ≤ i−1, there is a 50% chance that X_{t+1} is uniform over {1,...,i−1}, and a 50% chance that it is uniform over {1,...,i+1}.
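This mixture claim can be verified exactly. The text leaves the mixture weights implicit; they work out to (i−1)/(2i) on Unif({1,...,i−1}) and (i+1)/(2i) on Unif({1,...,i+1}), from which the 50% posterior follows. A rational-arithmetic check (my own sketch, with n = 10 and i = 4 chosen arbitrarily):

```python
from fractions import Fraction

def unif(support, n):
    # uniform probability vector over `support`, states 1..n (index 0 unused)
    p = [Fraction(0)] * (n + 1)
    for s in support:
        p[s] = Fraction(1, len(support))
    return p

def push(p, n):
    # one step of the walk on {1,...,n} applied to the distribution p
    q = [Fraction(0)] * (n + 1)
    for a in range(1, n + 1):
        q[max(a - 1, 1)] += p[a] / 2
        q[min(a + 1, n)] += p[a] / 2
    return q

n, i = 10, 4
lhs = push(unif(range(1, i + 1), n), n)   # one step from Unif({1,...,i})

# claimed mixture with weights (i-1)/(2i) and (i+1)/(2i)
w_small, w_big = Fraction(i - 1, 2 * i), Fraction(i + 1, 2 * i)
small, big = unif(range(1, i), n), unif(range(1, i + 2), n)
rhs = [w_small * s + w_big * b for s, b in zip(small, big)]
print(lhs == rhs)  # True

# posterior check: given X_{t+1} = b <= i-1, the smaller component has probability 1/2
posterior = w_small * small[1] / lhs[1]
print(posterior)  # 1/2
```

Since each state b ≤ i−1 receives mass 1/(2i) from each component, the posterior probability of either component is exactly 1/2, matching the 50% figure in the text.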