of Bougerol and Picard (1992b), which has found use among other things in GARCH theory; see Bougerol and Picard (1992a) and Francq and Zakoïan (2004). Under the additional conditions that $\int C_N\, p(dN) < \infty$ and $\int \rho[f_N(x_0), x_0]\, p(dN) < \infty$ for some $x_0 \in S$, there exists (Theorem 1.1 of Diaconis and Freedman (1999)) (1) a unique stationary distribution $\pi$ for $\{X_t\}$, such that (2) $\rho_P[P^t(x, \cdot), \pi] \le A_x r^t$, where $\rho_P$ is the Prokhorov metric induced by $\rho$, $P^t(x, dy)$ is the law of $X_t$ given $X_0 = x$, $0 < A_x < \infty$, and $0 < r < 1$ does not depend on $t$ or $x$.
The conditions of Diaconis and Freedman (1999), Theorem 5.1, are somewhat more relaxed than those of Theorem 1.1. This is achieved by securing an appropriate tail behavior, the so-called algebraic tail condition: a random variable $U$ has an algebraic tail if $P(U > u) < \alpha/u^{\beta}$ for all $u > 0$ and some $\alpha, \beta > 0$.
For the proof of these results we refer to Diaconis and Freedman (1999), but we briefly mention the role of the backward iterated process, since this concept also plays an important role in other proofs in the literature that will be mentioned in the sequel.
The forward iterated process is $X_t(x)$, defined by (writing $f_t$ for $f_{N_t}$)
$$X_t(x) = f_{t-1}(X_{t-1}) = (f_{t-1} \circ f_{t-2} \circ \cdots \circ f_0)(x), \qquad X_0 = x,$$
and this in general does not converge (almost surely). However, the backward iterated process
$$B_t(x) = (f_0 \circ f_1 \circ \cdots \circ f_{t-1})(x)$$
does converge almost surely to a limit independent of $x$. Since $X_t$ and $B_t$ have the same distribution, this can be used as a key element in proving the existence of a stationary measure. To understand intuitively why the backward process converges and the forward one does not, consider the example of an ordinary autoregressive process
$$X_t = aX_{t-1} + \varepsilon_{t-1}, \qquad X_0 = x, \quad |a| < 1.$$
Here $\varepsilon_t$ plays the role of $N_t$. Forward iteration yields
$$X_t = \varepsilon_{t-1} + a\varepsilon_{t-2} + \cdots + a^{t-1}\varepsilon_0 + a^t X_0,$$
and this does not converge almost surely since there is always new random variation introduced by the term $\varepsilon_{t-1}$, and we just have convergence in distribution. On the other hand, the backward iterated process is given by
$$B_t = \varepsilon_0 + a\varepsilon_1 + \cdots + a^{t-1}\varepsilon_{t-1} + a^t X_0,$$
and this converges almost surely to $B_\infty = \sum_{s=0}^{\infty} a^s \varepsilon_s$, since here the new randomness $\varepsilon_{t-1}$ is damped down by the factor $a^{t-1}$. Neither of the two processes $X_t$ and $B_t$ is stationary, due to the fixed initial condition, but a stationary process can be introduced as follows in the general case.
We let
$$\ldots, f_{-2}, f_{-1}, f_0, f_1, f_2, \ldots$$
be independent with common distribution $p$ and
$$W_t = \lim_{s \to \infty} (f_{t-1} \circ f_{t-2} \circ \cdots \circ f_{t-s})(x).$$
Then $\{W_t\}$ is stationary with the transition probability of $\{X_t\}$, and $B_\infty$ is distributed like any of the $W_t$. For the AR process, $W_t$ is defined as
$$W_t = \sum_{s=0}^{\infty} a^s \varepsilon_{t-1-s}.$$
(Note that $\{W_t\}$ is not the backward process and does not converge almost surely.)
The log condition $\int \ln C_N\, p(dN) < 0$ is difficult to check for a given model. Wu and Shao (2004) replace this condition by conditions that are easier to verify. Using the same notation as in Diaconis and Freedman (1999), they introduce the following two conditions:
(i) There exists an $x_0 \in S$ and $\alpha > 0$ such that
$$E[\rho(x_0, f_N(x_0))^{\alpha}] = \int \rho(x_0, f_N(x_0))^{\alpha}\, p(dN) < \infty.$$
(ii) There exists a $y_0 \in S$, $\alpha > 0$, $r(\alpha) \in (0, 1)$ and $C(\alpha) > 0$ such that
$$E[\rho(X_t(x), X_t(y_0))^{\alpha}] \le C(\alpha)\,(r(\alpha))^t\, \rho(x, y_0)^{\alpha}$$
for all $x \in S$ and $t \in \mathbb{N}$.
Condition (i) corresponds to the condition $\int \rho(f_{N_0}(x_0), x_0)\, p(dN) < \infty$ in Diaconis and Freedman (1999), but is weaker. Condition (ii) replaces $\int \ln C_N\, p(dN) < 0$ and is also a contraction condition, the "geometric moment contracting" condition in the terminology of Wu and Shao (2004).
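As a simple illustration (our own worked example, not from Wu and Shao (2004)), take the AR(1) maps $f_\varepsilon(x) = ax + \varepsilon$ of the previous example, with $S = \mathbb{R}$, $\rho(x, y) = |x - y|$ and $|a| < 1$. Condition (i) holds for any $\alpha > 0$ such that $E|\varepsilon|^{\alpha} < \infty$, since $\rho(x_0, f_\varepsilon(x_0)) = |(a - 1)x_0 + \varepsilon|$. For condition (ii), two iterations started from $x$ and $y_0$ but driven by the same innovations satisfy $X_t(x) - X_t(y_0) = a^t(x - y_0)$, so that
$$E[\rho(X_t(x), X_t(y_0))^{\alpha}] = |a|^{\alpha t}\, |x - y_0|^{\alpha},$$
and (ii) holds with $C(\alpha) = 1$ and $r(\alpha) = |a|^{\alpha} \in (0, 1)$.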
Under these conditions there exists a unique stationary measure $\pi$. Moreover, Wu and Shao are able to say something about the convergence of the backward iterated process to $B_\infty$, namely
$$E[\rho(B_t(x), B_\infty)^{\alpha}] \le C\,(r(\alpha))^t,$$
where $C > 0$ depends solely on $x$, $x_0$, $b_0$ and $\alpha$, and where $0 < r(\alpha) < 1$. This mode of convergence can subsequently be exploited to obtain central limit theorems and asymptotic theory. The extension of a process from $t = 0, 1, 2, \ldots$ to $t = 0, \pm 1, \pm 2, \ldots$ as outlined earlier plays a role in this. Applications to autoregressive count processes are considered by Davis and Liu (2014). We will return to this later in the survey.
4.2.2 The General Markov Chain Approach
In the preceding subsection, we briefly surveyed the iterated random function approach to establishing the existence of a stationary measure. Now we will look at this from a more general Markov chain point of view. Roughly speaking, there are two approaches: a topological Markov chain approach and a measure-theoretic Markov chain approach. The latter generally gives stronger results, but it is based on an irreducibility assumption which is not fulfilled in general for the models (4.1), (4.2), but which is made to be fulfilled in the perturbation approach. Markov theory, as applied to time series, has largely been dominated by the irreducibility approach, but Tweedie (1988) showed that irreducibility can be avoided at a cost. The integer time series theory has given a new impetus to proving existence of
a stationary measure using the first approach; we will look at this in Sections 4.2.2 through 4.2.5, and then at the use of irreducibility and perturbation in Sections 4.2.6 and 4.2.7.
A common condition for both approaches is a form of stability condition. It can be phrased in many ways. We take it from Meyn and Tweedie (2009); in fact, a substantial part of our material is based on their classical book. For a Markov process $\{X_t\}$ with transition probability $P(x, dy)$, the drift operator is defined for any nonnegative measurable $V$ by
$$\Delta V(x) = \int_S P(x, dy)\, V(y) - V(x). \qquad (4.12)$$
Stability conditions in terms of the drift operator can be formulated in many ways, for example
$$\Delta V(x) \le -1 + b\,1_C(x), \qquad (4.13)$$
where $b > 0$ is a constant and $1_C(\cdot)$ is the indicator function of the measurable set $C$. This entails a strict drift towards the set $C$.
Geometric drift towards $C$ is stronger. Then there exist a function $V : S \to [1, \infty]$ and a measurable set $C$ such that
$$\Delta V(x) \le -\beta V(x) + b\,1_C(x), \qquad x \in S, \qquad (4.14)$$
where $\beta > 0$ and $b < \infty$.
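As an illustration of how (4.14) can be verified (our own worked example, not from Meyn and Tweedie (2009)), consider again the AR(1) process $X_t = aX_{t-1} + \varepsilon_{t-1}$ with $|a| < 1$, $E|\varepsilon_t| < \infty$, and take $V(x) = |x| + 1$. Then
$$\Delta V(x) = E|ax + \varepsilon_t| - |x| \le -(1 - |a|)\,|x| + E|\varepsilon_t|,$$
so that (4.14) holds with $\beta = (1 - |a|)/2$, $b = E|\varepsilon_t| + \beta$ and $C = \{x : |x| \le 2b/(1 - |a|)\}$.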
The weak Feller property is an important Markov chain property. As can be seen from Theorem 4.1 below, it can be used as an instrument in establishing the existence of a stationary measure, but not uniqueness. It states that if $g$ is a continuous bounded function, then $E(g(X_{t+1})\,|\,X_t = x)$ is a continuous bounded function of $x$. This can be formulated for a locally compact state space $S$. Then we have the following result.
Theorem 4.1 (Meyn and Tweedie (2009), Theorem 12.3.4): If $\{X_t\}$ is weakly Feller, the drift condition (4.13) holds with $C$ a compact set, and if there exists an $x_0$ such that $V(x_0) < \infty$, then there exists an invariant measure for $\{X_t\}$.
All of the conditions of Theorem 4.1 are weak in the context we are considering. For the linear Poisson system (4.4), (4.5) it is easily seen that if $0 < a + b < 1$, then the drift condition (4.13) and the condition of existence of an $x_0$ such that $V(x_0) < \infty$ are satisfied by choosing $V(\lambda) = |\lambda| + 1 = \lambda + 1$, and by choosing the support of the compact set $C$ large enough. Indeed, for verifying the weak Feller property, consider a continuous bounded function $g$ and let $h(x) = E(g(\lambda_t)\,|\,\lambda_{t-1} = x) = E(g(d + ax + bN_{t-1}(x)))$. The boundedness
of $h$ follows trivially. Consider $h(x) - h(y)$ and let $A$ be the event that $N_{t-1}$ has no jumps in $(y - \eta, y + \eta)$, where $\eta$ is to be chosen later. Then $P(A) = e^{-2\eta}$, and
$$h(x) - h(y) = E\bigl\{[g(d + ax + bN_{t-1}(x)) - g(d + ay + bN_{t-1}(y))]\,1_A\bigr\} + E\bigl\{[g(d + ax + bN_{t-1}(x)) - g(d + ay + bN_{t-1}(y))]\,1_{A^c}\bigr\} =: I + II.$$
On $A^c$ we have
$$|II| \le 2\|g\|_{\infty}\, P(A^c) = 2\|g\|_{\infty}\,(1 - e^{-2\eta}),$$
where $\|g\|_{\infty} = \sup_x |g(x)|$. This expression can be made arbitrarily small by choosing $\eta$ small enough. On $A$ the mapping $x \mapsto g(d + ax + bN_{t-1}(x))$ is continuous if $x$ is close to $y$, and by Lebesgue dominated convergence $|I| \to 0$ as $y \to x$; the proof can be completed by standard arguments. In the nonlinear situation where
$$\lambda_t = d + f_1(\lambda_{t-1}) + f_2(N_{t-1}(\lambda_{t-1})) \qquad (4.15)$$
and with $f_1$ and $f_2$ positive, monotone and continuous such that for $y < x$, $f_1(x) - f_1(y) \le a(x - y)$ and $f_2(x) - f_2(y) \le b(x - y)$, then if $a + b < 1$, the stability condition (4.13) and the weak Feller property are both satisfied, so that an invariant measure exists.
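To make the drift discussion concrete, here is a minimal Python sketch (our own illustration; it assumes that (4.4), (4.5) is the linear specification $\lambda_t = d + a\lambda_{t-1} + bY_{t-1}$ with $Y_{t-1}\,|\,\lambda_{t-1} \sim \mathrm{Poisson}(\lambda_{t-1})$, consistent with the conditional expectation used above, and the parameter values are arbitrary). It estimates the drift $\Delta V(x)$ for $V(\lambda) = \lambda + 1$ by Monte Carlo and compares it with the exact value $d - (1 - a - b)x$, which is $\le -1$ once $x \ge (d + 1)/(1 - a - b)$.

```python
import numpy as np

# Monte Carlo check of the drift condition (4.13) for the linear Poisson
# autoregression lambda_t = d + a*lambda_{t-1} + b*Y_{t-1},
# Y_{t-1} | lambda_{t-1} ~ Poisson(lambda_{t-1}), with V(x) = x + 1.
rng = np.random.default_rng(1)
d, a, b = 0.5, 0.4, 0.3                      # illustrative values, a + b < 1

def drift(x, n=200_000):
    """Estimate Delta V(x) = E[V(lambda_1) | lambda_0 = x] - V(x)."""
    lam1 = d + a * x + b * rng.poisson(x, size=n)
    return np.mean(lam1 + 1.0) - (x + 1.0)

for x in [1.0, 5.0, 20.0, 50.0]:
    print(f"x = {x:5.1f}   drift ~ {drift(x):8.3f}   exact {d - (1 - a - b) * x:8.3f}")
```

The estimated drift is strongly negative outside a bounded interval, which is exactly the behavior required by (4.13) with a compact set $C$.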
The disadvantage of Theorem 4.1 is that uniqueness is not obtained in general. The strong Feller property, where a bounded measurable function $g$ is mapped into a continuous bounded function $h(x) = E(g(X_{t+1})\,|\,X_t = x)$, is sufficient to obtain uniqueness, but unfortunately, the Markov chains generated by the models we look at are not strong Feller, so an alternative way has to be found.
There are in fact several ways of doing this. One route goes via the so-called e-chains. The e stands for equicontinuity, and an e-chain is a strengthening of the weak Feller property from continuity to equicontinuity. On the other hand, it is sufficient to look at functions $g$ having compact support and such that for a given $\varepsilon > 0$ there exists a $\delta > 0$ such that
$$|E(g(X_{t+1})\,|\,X_t = x) - E(g(X_{t+1})\,|\,X_t = y)| < \varepsilon$$
for all $t \ge 1$ whenever $|x - y| < \delta$. We also need the concept of boundedness in probability
on average: First, a sequence of probability measures $\{P_k,\ k \in \mathbb{N}\}$ is tight if for each $\varepsilon > 0$ there is a compact set $C \subset S$ such that $\liminf_{k \to \infty} P_k(C) \ge 1 - \varepsilon$. The chain $\{X_t\}$ will be said to be bounded in probability on average if for each initial condition $x \in S$ the sequence $\{\bar{P}_k(x; \cdot),\ k \in \mathbb{N}\}$ is tight, where
$$\bar{P}_k(x; \cdot) = \frac{1}{k} \sum_{t=1}^{k} P^t(x; \cdot).$$
In other words,
$$\liminf_{k \to \infty} \frac{1}{k} \sum_{t=1}^{k} P(X_t \in C\,|\,X_0 = x) \ge 1 - \varepsilon.$$
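A minimal sketch of how this can be checked by simulation (our own illustration, using the same assumed linear specification and arbitrary parameters as in the sketch above): simulate many independent paths started at a large $x$ and average the empirical probabilities of the set $C = [0, M]$ over $t = 1, \ldots, k$.

```python
import numpy as np

# Averaged occupation of C = [0, M] for the linear Poisson autoregression,
# approximating (1/k) * sum_{t=1}^{k} P(lambda_t in C | lambda_0 = x).
rng = np.random.default_rng(2)
d, a, b = 0.5, 0.4, 0.3
x0, k, n_paths, M = 50.0, 200, 5_000, 10.0

lam = np.full(n_paths, x0)
visits = np.zeros(n_paths)                 # number of times each path is in C
for t in range(k):
    lam = d + a * lam + b * rng.poisson(lam)
    visits += (lam <= M)

print("averaged occupation of C:", visits.mean() / k)
```

Since the stationary mean under this specification is $d/(1 - a - b)$, the averaged occupation is close to one for moderate $M$, illustrating tightness of the averaged transition probabilities.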
Finally, a point $x_0 \in S$ is called reachable if for every open set $O \in \mathcal{B}(S)$, the collection of Borel sets in $S$, containing $x_0$ (i.e., for every neighborhood of $x_0$)
$$\sum_{i} P^i(O\,|\,y) > 0, \qquad y \in S.$$
The chain $\{X_t\}$ is said to be open set irreducible if every point is reachable, this being the topological analogue of measure-theoretic irreducibility. In the statement of the following theorem we do not need open set irreducibility; it is sufficient that there is one reachable state.
Theorem 4.2 (Meyn and Tweedie (2009), Theorem 18.0.2 (ii)): Assume that $\{X_t\}$ is a chain on a topological space such that the assumptions of Theorem 4.1 hold, and such that in addition
(i) there exists a reachable state;
(ii) $\{X_t\}$ is an e-chain;
(iii) $\{X_t\}$ is bounded in probability on average.
Then the invariant probability measure of Theorem 4.1 is unique.
It is possible to state this theorem with weaker conditions; see Meyn and Tweedie (2009)
Theorem 18.0.2.
Continuing with the example used to illustrate Theorem 4.1, consider again the linear model (4.4), (4.5). That there is a reachable state for this model is trivial to prove. In fact every state in $\mathbb{R}^+$ can be proved to be reachable; see Fokianos et al. (2009). Boundedness in probability on average follows from boundedness in probability, which in turn follows from Chebyshev's inequality (cf. Neumann 2011, eq. (2.5) and the proof of his Theorem 2.1), where
$$E(\lambda_t\,|\,x) = \sum_{i=0}^{t-1} (a + b)^i d + (a + b)^t x.$$
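This expression follows by iterating the one-step conditional mean (a short derivation of our own, using $E(Y_{t-1}\,|\,\lambda_{t-1}) = \lambda_{t-1}$ for the assumed linear specification):
$$E(\lambda_t\,|\,\lambda_{t-1}) = d + (a + b)\lambda_{t-1} \quad\Longrightarrow\quad E(\lambda_t\,|\,\lambda_0 = x) = d\sum_{i=0}^{t-1} (a + b)^i + (a + b)^t x \le \frac{d}{1 - a - b} + x,$$
so the conditional means are bounded uniformly in $t$ when $a + b < 1$, and Chebyshev's inequality (in its first-moment form) then gives boundedness in probability.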
Proving the e-chain property requires a little more work. The proof is by recursion and the
reader is referred to Wang et al. (2014). Their proof is for a more general threshold-type
model.
For the nonlinear system (4.15), and with the conditions stated after the proof of the weak Feller property, it can be proved using essentially the same methods that the chain is an e-chain, is bounded in probability on average, and has a reachable state. Hence, there exists a unique stationary measure for such a system.
Woodard et al. (2011) also have as their point of departure a general Markov chain.
Instead of using the e-chain, they use a condition termed “asymptotically strong Feller.”
This condition is fairly technical, and since we are not using it henceforth, we refer to their
paper for a precise formulation. If in addition a reachable point exists, then uniqueness of
the stationary distribution is obtained. Douc et al. (2013) use the asymptotic strong Feller
property.