Here, $\{M_t\}$ is an IID sequence supported on $\{0, 1, 2, \ldots\}$ and $c$ is some nonnegative integer constant. For simplicity, we assume that $c = 0$. Defined as in (5.18), $\{Y_t\}$ is easily shown to be a strictly stationary sequence.
Specifying the marginal distribution of $Y_t$ is equivalent to specifying the probability generating function $\psi_Y(r) = E\big[ r^{Y_t} \big]$. Equation (5.18) gives
\[
\psi_Y(r) = \psi_M\!\left( \frac{r}{\mu} + 1 - \frac{1}{\mu} \right), \tag{5.19}
\]
where $\psi_M(r) = E\big[ r^{M} \big]$ and $M$ has the same distribution as $M_t$ for any $t$. Letting $z = r/\mu + (1 - 1/\mu)$ in (5.19) produces
\[
\psi_M(z) = \psi_Y\big( 1 + \mu(z - 1) \big).
\]
To verify that $\psi_M(\cdot)$ is a legitimate generating function of a random variable supported on $\{0, 1, \ldots\}$, note that the $k$th derivative of the generating function at zero must satisfy $\psi_M^{(k)}(0) = \psi_Y^{(k)}(1 - \mu)\,\mu^{k} \geq 0$ for $k = 0, 1, \ldots$, which holds if $\psi_Y^{(k)}(1 - \mu) \geq 0$ for $k = 0, 1, 2, \ldots$.
Note that $\lim_{z \to 1} \psi_Y\big( 1 + \mu(z - 1) \big) = \psi_Y(1) = 1$ since $\psi_Y(\cdot)$ is a valid probability generating function. Hence, many discrete non-negative integer-valued distributions can be built from the renewal superposition in (5.18).
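For a concrete instance of this check (our own illustration, not from the chapter), take a Poisson($\lambda$) marginal for $Y_t$: then $\psi_Y(r) = e^{\lambda(r-1)}$, so $\psi_M(z) = \psi_Y(1 + \mu(z-1)) = e^{\lambda\mu(z-1)}$, every derivative of $\psi_M$ at zero is nonnegative, and $M$ is simply Poisson($\lambda\mu$); the Poisson marginal is therefore buildable. The short sympy sketch below carries out the derivative check; the parameter values $\lambda = 3/2$ and $\mu = 2$ are arbitrary.

```python
import sympy as sp

r, z = sp.symbols('r z')
lam, mu = sp.Rational(3, 2), 2                        # illustrative values (ours, not the text's)

psi_Y = sp.exp(lam * (r - 1))                         # PGF of a Poisson(lam) marginal for Y_t
psi_M = sp.simplify(psi_Y.subs(r, 1 + mu * (z - 1)))  # psi_M(z) = psi_Y(1 + mu*(z - 1))
print(psi_M)                                          # simplifies to exp(3*(z - 1)): Poisson(lam*mu)

# buildability check: every Taylor coefficient of psi_M at z = 0 must be nonnegative
coeffs = [sp.diff(psi_M, z, k).subs(z, 0) / sp.factorial(k) for k in range(6)]
print(all(float(c) >= 0 for c in coeffs))             # True
```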
The autocovariance functions that can be built from (5.18) are less clear. Any solution to (5.18) with $c = 0$ is stationary with mean $E[Y_t] \equiv E[M_1]/\mu$ and autocovariance
\[
\gamma_Y(h) = \frac{E[\min(M_1, M_2)]}{\mu}\left( u_h - \frac{1}{\mu} \right), \quad h = 1, 2, \ldots. \tag{5.20}
\]
The process variance is
\[
\gamma_Y(0) = \frac{\operatorname{Var}(M_t)}{\mu^{2}} + \frac{E[M_t]}{\mu}\left( 1 - \frac{1}{\mu} \right).
\]
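A small simulation can corroborate these formulas. The sketch below is our own illustration and assumes details not displayed here: that (5.18) is the superposition $Y_t = \sum_{i=1}^{M_t} X_{t,i}$ (the univariate analogue of (5.25) below), that $\{M_t\}$ is independent of the renewal indicator sequences $\{X_{t,i}\}$, and that each indicator sequence is started from the stationary delay law $P(A = k) = P(L \geq k)/\mu$. The lifetime distribution (uniform on $\{1, 2, 3\}$) and the Poisson choice for $M_t$ are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# assumed ingredients (ours, not the text's): lifetime L uniform on {1,2,3}; M_t IID Poisson(lam)
support, f = np.array([1, 2, 3]), np.array([1/3, 1/3, 1/3])   # P(L = k)
mu = float(support @ f)                                        # E[L] = 2
lam, n = 5.0, 100_000

def stationary_renewal_indicators(n, rng):
    """Indicators X_1, ..., X_n of one stationary renewal process: the first renewal
    epoch follows the stationary delay P(A = k) = P(L >= k)/mu; later gaps are IID lifetimes."""
    delay_probs = np.array([f[k - 1:].sum() for k in support]) / mu
    gaps = rng.choice(support, size=n, p=f)            # more lifetimes than can fit in 1..n
    epochs = rng.choice(support, p=delay_probs) + np.concatenate(([0], np.cumsum(gaps)))
    X = np.zeros(n, dtype=np.int8)
    X[epochs[epochs <= n] - 1] = 1
    return X

M = rng.poisson(lam, size=n)
Xmat = np.vstack([stationary_renewal_indicators(n, rng) for _ in range(M.max())])
Y = np.array([Xmat[:m, t].sum() for t, m in enumerate(M)])     # Y_t = sum_{i=1}^{M_t} X_{t,i}

# theory: u_h from the renewal recursion u_h = sum_k P(L=k) u_{h-k}; E[min(M_1,M_2)] for Poisson M
H = 3
u = np.zeros(H + 1); u[0] = 1.0
for h in range(1, H + 1):
    u[h] = sum(f[k - 1] * u[h - k] for k in support if k <= h)
Emin = np.sum(stats.poisson.sf(np.arange(1, 200) - 1, lam) ** 2)   # E[min] = sum_k P(M >= k)^2

print("mean:", Y.mean(), "theory:", lam / mu)
print("var :", Y.var(), "theory:", lam / mu**2 + (lam / mu) * (1 - 1 / mu))
Yc = Y - Y.mean()
for h in range(1, H + 1):
    print(f"lag {h} autocovariance:", np.mean(Yc[:-h] * Yc[h:]),
          "theory:", Emin * (u[h] - 1 / mu) / mu)
```

Up to Monte Carlo error, the printed empirical mean, variance, and lag-$h$ autocovariances should match $E[M_1]/\mu$, the variance display above, and (5.20).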
Now suppose that $\{\gamma(h)\}_{h=0}^{\infty}$ is given and is known to be the autocovariance function of some stationary series. We want to construct a renewal count process $\{Y_t\}$ in the form of (5.18) having this autocovariance structure. Taking a power series transform gives
\[
\Gamma(z) := \sum_{h=0}^{\infty} \gamma_Y(h) z^{h}
= \frac{\sigma_M^{2}}{\mu^{2}} + \frac{\mu_M - E[\min(M_1, M_2)]}{\mu}\left( 1 - \frac{1}{\mu} \right)
+ \frac{E[\min(M_1, M_2)]}{\mu} \sum_{h=0}^{\infty}\left( u_h - \frac{1}{\mu} \right) z^{h},
\]
where $\mu_M \equiv E[M_t]$ and $\sigma_M^{2} \equiv \operatorname{Var}(M_t)$. Let $U(z) = \sum_{h=0}^{\infty} u_h z^{h}$ and recall the classical relationship $U(z) = 1/[1 - \psi_L(z)]$, where $\psi_L(z) = E\big[ z^{L} \big]$ (Smith, 1958). Solving for $\psi_L$ gives
\[
\psi_L(z) = 1 - \left[ \frac{\mu \Gamma(z)}{E[\min(M_1, M_2)]} - D + \frac{1}{\mu(1 - z)} \right]^{-1}, \tag{5.21}
\]
where
\[
D = \frac{\sigma_M^{2} + (\mu - 1)\mu_M}{\mu E[\min(M_1, M_2)]} - \left( 1 - \frac{1}{\mu} \right).
\]
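The same construction can be attempted without manipulating generating functions: inverting (5.20) gives $u_h = \mu\,\gamma_Y(h)/E[\min(M_1, M_2)] + \mu^{-1}$ for $h \geq 1$ (with $u_0 = 1$), and the lifetime probabilities can then be peeled off one lag at a time from the discrete renewal recursion $u_h = \sum_{k=1}^{h} P(L = k)\, u_{h-k}$, which we take to be the chapter's (5.1). The Python sketch below is our own illustration of this route on a buildable target: it generates an autocovariance from a known lifetime law, inverts, and recovers that law. Feeding in a target that is not buildable makes the recursion return a negative mass at some lag, which flags the failure.

```python
import numpy as np

def lifetime_from_acov(gamma, mu, Emin):
    """Given target autocovariances gamma[0..H], mu = E[L], and Emin = E[min(M_1, M_2)],
    invert (5.20) to get u_1, ..., u_H, then peel off f_k = P(L = k) from the renewal
    recursion u_h = sum_{k=1}^{h} f_k u_{h-k} (the assumed form of the chapter's (5.1))."""
    H = len(gamma) - 1
    u = np.empty(H + 1)
    u[0] = 1.0
    u[1:] = mu * np.asarray(gamma[1:], dtype=float) / Emin + 1.0 / mu
    f = np.zeros(H + 1)                               # f[k] = P(L = k)
    for h in range(1, H + 1):
        f[h] = u[h] - np.dot(f[1:h], u[1:h][::-1])
    return f[1:]

# round-trip demo: lifetime uniform on {1, 2, 3} (so mu = 2) and constant M_t = 4,
# for which E[min(M_1, M_2)] = 4 and Var(M_t) = 0
support, pmf, m, H = [1, 2, 3], [1/3, 1/3, 1/3], 4, 12
mu = sum(k * p for k, p in zip(support, pmf))
u_true = np.zeros(H + 1); u_true[0] = 1.0
for h in range(1, H + 1):
    u_true[h] = sum(pmf[k - 1] * u_true[h - k] for k in support if k <= h)
gamma0 = (m / mu) * (1 - 1 / mu)                      # process variance when Var(M_t) = 0
gamma = [gamma0] + [m * (u_true[h] - 1 / mu) / mu for h in range(1, H + 1)]   # (5.20)

f_rec = lifetime_from_acov(gamma, mu, Emin=m)
print(np.round(f_rec[:5], 6))                         # recovers 1/3, 1/3, 1/3, then (numerically) zero
print("legitimate lifetime pmf?", bool((f_rec >= -1e-12).all()))
```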
Unfortunately, without further assumptions, $\psi_L(\cdot)$ in (5.21) may not be a legitimate generating function of a lifetime $L$ supported on $\{1, 2, \ldots\}$. To see an example of this, consider the autocorrelation function
\[
\rho_Y(h) =
\begin{cases}
\phi^{|h/T|}, & h = 0, \pm T, \pm 2T, \ldots, \\
0, & \text{otherwise},
\end{cases}
\]
where the parameter $\phi \in (-1, 1)$ and $T > 1$ is some positive integer. This is the autocorrelation function of a causal $T$th-order stationary autoregression whose first $T - 1$ autoregressive coefficients are zero.
To see that this autocovariance is not "buildable," the zero autocorrelations at lags $1, 2, \ldots, T - 1$ imply that $u_h = \mu^{-1}$ for $h = 1, \ldots, T - 1$. Using this in (5.1) with an induction reveals a geometric form over the first $T - 1$ support set values:
\[
P(L = k) = \frac{1}{\mu}\left( 1 - \frac{1}{\mu} \right)^{k-1}, \quad k = 1, 2, \ldots, T - 1.
\]
Plugging these into (5.1) gives
\[
u_T = P(L = T) + \frac{1}{\mu}\left[ 1 - \left( 1 - \frac{1}{\mu} \right)^{T-1} \right]. \tag{5.22}
\]
At lag $T$, (5.20) requires that
\[
\phi = \rho_Y(T) = \frac{E[\min(M_1, M_2)]\left( u_T - \mu^{-1} \right)}{\sigma_M^{2}\mu^{-1} + \mu_M\left( 1 - \mu^{-1} \right)}. \tag{5.23}
\]
Combining (5.22) and (5.23) now gives
\[
P(L = T) = \phi\, \frac{\sigma_M^{2}\mu^{-1} + \mu_M\left( 1 - \mu^{-1} \right)}{E[\min(M_1, M_2)]} + \frac{1}{\mu}\left( 1 - \frac{1}{\mu} \right)^{T-1}. \tag{5.24}
\]
Using $\mu \geq 1$, a calculus maximization gives the bound $\mu^{-1}\left( 1 - \mu^{-1} \right)^{T-1} \leq T^{-1}$. Now define
\[
\kappa_M(\mu) = \frac{\sigma_M^{2}\mu^{-1} + \mu_M\left( 1 - \mu^{-1} \right)}{E[\min(M_1, M_2)]}
\]
and note that $\kappa_M(\mu) > 0$ for every $\mu \geq 1$. Our bounds provide
\[
P(L = T) \leq \phi\, \kappa_M(\mu) + \frac{1}{T},
\]
which is negative (a contradiction) when $\phi$ is chosen close enough to $-1$ and $T$ is large.
To see that $\kappa_M(\mu)$ is bounded away from zero, note that
\[
\kappa_M(\mu) > \frac{\min\big( \sigma_M^{2}, \mu_M \big)}{E[\min(M_1, M_2)]},
\]
which is strictly positive unless $M_t$ has zero variance. If $\operatorname{Var}(M_t) \equiv 0$, then $M_t \equiv M$, where $M$ is a constant. A return to (5.24) now provides
\[
P(L = T) = \phi\left( 1 - \mu^{-1} \right) + \mu^{-1}\left( 1 - \mu^{-1} \right)^{T-1} \leq \phi\left( 1 - \mu^{-1} \right) + T^{-1},
\]
which again can be made negative by choosing $T$ large and $\phi$ close to $-1$.
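For instance, with $\mu = 2$, $T = 6$, and $\phi = -0.9$, the display above gives $P(L = 6) = -0.9 \times \tfrac{1}{2} + \tfrac{1}{2} \times \left( \tfrac{1}{2} \right)^{5} \approx -0.43$, which no probability can equal.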
Of course, it is not clear how to build a general stationary count series having both a prespecified autocovariance function and a prespecified marginal distribution; such processes may not even exist. Notwithstanding, the renewal class seems to be a very flexible class of stationary count time series models.
5.3 Multivariate Series
So far, we have largely concentrated on univariate models. In many applications, multiple
count series arise simultaneously and there is a need to develop count process models that
allow for correlation between components (cross-correlation). The natural generalization
of (5.10) to two dimensions (omitting any additive constants) is
\[
Y_t :=
\begin{pmatrix} Y_t^{(1)} \\[3pt] Y_t^{(2)} \end{pmatrix}
=
\begin{pmatrix} \sum_{i=1}^{M_t^{(1)}} X_{t,i}^{(1)} \\[3pt] \sum_{j=1}^{M_t^{(2)}} X_{t,j}^{(2)} \end{pmatrix}. \tag{5.25}
\]
Here, the superscripts (1) and (2) index components, $M_t = \big( M_t^{(1)}, M_t^{(2)} \big)'$ is a bivariate nonnegative integer-valued pair, and $\{X_{t,i}\} = \big\{ \big( X_{t,i}^{(1)}, X_{t,i}^{(2)} \big)' \big\}$ is a two-dimensional binary sequence formed by a bivariate renewal process as described below. We take $\{X_{t,i}\}$ independent of $\{X_{t,j}\}$ when $i \neq j$. Here, $X_{t,i}^{(1)}$ is unity if and only if a renewal occurs at time $t$ in the first coordinate of the $i$th renewal process (similarly for the second coordinate).
Clarifying further, a bivariate renewal process entails a two-dimensional version of the univariate case: bivariate IID lifetimes $\{L_i\}_{i=1}^{\infty}$, with $L_i = \big( L_i^{(1)}, L_i^{(2)} \big)'$, are added coordinatewise, yielding the random walk
\[
S_n = L_0 + L_1 + L_2 + \cdots + L_n, \quad n = 0, 1, \ldots.
\]
We say that a point occurs in the bivariate random walk at time $t = (t_1, t_2)'$ when $S_n = t$ for some $n \geq 0$. The bivariate walk contains two univariate renewal processes. An important distinction from the one-dimensional case arises. Suppose that a point occurs in the bivariate walk at time $(t_1, t_2)'$. Then a renewal occurred at time $t_1$ in coordinate one and a renewal occurred in coordinate two at time $t_2$. However, the converse is not true in general. In fact, there are many ways for a renewal to occur in coordinate one at time $t_1$ and a renewal to occur in coordinate two at time $t_2$ without having a point in the bivariate walk at $(t_1, t_2)'$. For example, if $L_0 = (1, 1)'$, $L_1 = (3, 2)'$, and $L_2 = (2, 5)'$, then a renewal occurs at time 6 in coordinate one and a renewal occurs in coordinate two at time 3, but there is no point at $(6, 3)$ in the bivariate walk.
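The few lines below (our own illustration) replay this example numerically; the coordinatewise renewal epochs are the columns of the cumulative-sum array, while the bivariate walk points are its rows.

```python
import numpy as np

# lifetimes from the example above
L = np.array([[1, 1], [3, 2], [2, 5]])
S = np.cumsum(L, axis=0)                      # bivariate walk points S_0, S_1, S_2
print(S.tolist())                             # [[1, 1], [4, 3], [6, 8]]

print(6 in set(S[:, 0]), 3 in set(S[:, 1]))   # coordinatewise renewals at t=6 and t=3: True True
print([6, 3] in S.tolist())                   # but (6, 3) is not a bivariate walk point: False
```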
Correlation between $L_j^{(1)}$ and $L_j^{(2)}$ is allowed for a fixed $j$; however, $L_j^{(1)}$ and $L_k^{(2)}$ are independent when $j \neq k$. The notation $L^{(1)}$ and $L^{(2)}$ will denote generic component lifetimes when the index $i \geq 1$ is of no importance. As in one dimension, an initial lifetime $L_0$ exists that makes the two-dimensional renewal sequence bivariate stationary. We will not concern ourselves with the form of $L_0$ here for reasons that follow.
Arguing as in the univariate case, $\{Y_t\}$ in (5.25) can be shown to be a strictly stationary bivariate time series of counts. For autocovariance notation, let $\gamma_Y^{(1,1)}(h) = \operatorname{Cov}\big( Y_t^{(1)}, Y_{t+h}^{(1)} \big)$ and $\gamma_Y^{(2,2)}(h) = \operatorname{Cov}\big( Y_t^{(2)}, Y_{t+h}^{(2)} \big)$ denote the coordinatewise lag-$h$ covariances, and let $\gamma_Y^{(1,2)}(h) = \operatorname{Cov}\big( Y_t^{(1)}, Y_{t+h}^{(2)} \big)$ denote the lag-$h$ cross-covariances. The marginal autocovariances $\gamma_Y^{(1,1)}(\cdot)$ and $\gamma_Y^{(2,2)}(\cdot)$ take the form in (5.20):
\[
\gamma_Y^{(1,1)}(h) = \frac{C^{(1)}}{E\big[ L^{(1)} \big]}\left( u_h^{(1)} - \frac{1}{E\big[ L^{(1)} \big]} \right),
\qquad
\gamma_Y^{(2,2)}(h) = \frac{C^{(2)}}{E\big[ L^{(2)} \big]}\left( u_h^{(2)} - \frac{1}{E\big[ L^{(2)} \big]} \right),
\qquad h = 1, 2, \ldots.
\]
Here, $C^{(1)} = E\big[ \min\big( M_t^{(1)}, M_{t+h}^{(1)} \big) \big]$ and $C^{(2)} = E\big[ \min\big( M_t^{(2)}, M_{t+h}^{(2)} \big) \big]$ for $h > 0$. A detailed calculation shows that, for $h = 0, 1, 2, \ldots$,
\[
\gamma_Y^{(1,2)}(h) = \frac{\operatorname{Cov}\big( M_t^{(1)}, M_{t+h}^{(2)} \big)}{\mu^{(1)} \mu^{(2)}}
+ E\big[ \min\big( M_t^{(1)}, M_{t+h}^{(2)} \big) \big]
\left[ P\big( X_{t,1}^{(1)} = 1 \cap X_{t+h,1}^{(2)} = 1 \big) - \frac{1}{\mu^{(1)} \mu^{(2)}} \right],
\]
where $\mu^{(1)} = E\big[ L^{(1)} \big]$ and $\mu^{(2)} = E\big[ L^{(2)} \big]$. By strict stationarity of the bivariate renewal sequence, $P\big( X_{t,1}^{(1)} = 1 \cap X_{t+h,1}^{(2)} = 1 \big)$ does not depend on $t$.
An interesting issue now arises. Unless $\operatorname{Corr}\big( L^{(1)}, L^{(2)} \big) \equiv 1$, $P\big( X_{t,1}^{(1)} = 1 \cap X_{t+h,1}^{(2)} = 1 \big) = 1/\big( \mu^{(1)} \mu^{(2)} \big)$ for all $h$. While we will not prove this here, the intuition is that at a large time $t$, the lifetimes in use in the coordinate-one and coordinate-two random walks will almost surely have different indices, and independence is assumed between $L_i$ and $L_j$ when $i \neq j$. Elaborating, suppose that at time $t$ in a nondelayed setting ($L_0^{(1)} = L_0^{(2)} = 0$), the first component is using the $i$th lifetime $L_i^{(1)}$ and the second component is using the $j$th lifetime $L_j^{(2)}$. Then when $t$ is large, it is very unlikely that $i = j$; in fact, in the limit as $t \to \infty$, $i \neq j$ with probability one. The implication is that the cross-covariance structure of the process reduces to
\[
\gamma_Y^{(1,2)}(h) = \frac{\operatorname{Cov}\big( M_t^{(1)}, M_{t+h}^{(2)} \big)}{\mu^{(1)} \mu^{(2)}}.
\]
Hence, if $\{M_t\}$ is taken as IID as in the univariate case, $\gamma_Y^{(1,2)}(h) = 0$ when $h \geq 1$.
Here are some tactics to induce nonzero cross-correlation between components. The earlier cross-covariance does not assume independence between $M_t^{(1)}$ and $M_{t+h}^{(2)}$. We hence allow them to be dependent in special ways. One way simply links $\{M_t\}$ to a correlated univariate count series $\{N_t\}$ (say generated by univariate renewal methods) via
\[
M_t = \begin{pmatrix} N_t \\ N_{t-1} \end{pmatrix}.
\]
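As a minimal indexing sketch of this linkage (our own illustration; the array `N` is merely a stand-in for a correlated stationary count series, such as one built by the univariate renewal methods above):

```python
import numpy as np

N = np.array([3, 5, 4, 6, 2, 7, 5, 4])   # stand-in for a stationary count series {N_t}
M = np.column_stack([N[1:], N[:-1]])     # row t holds M_t = (N_t, N_{t-1})'
print(M[:3])                             # [[5 3] [4 5] [6 4]]
```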
Then $\gamma_Y^{(1,2)}(h) = \gamma_N(h - 1)/\big( \mu^{(1)} \mu^{(2)} \big)$ and there can be nonzero correlation between components.
A second tactic for inducing cross-correlation is based on copulas. Suppose $F_1$ and $F_2$ are CDFs of the desired (prespecified) component marginal distributions. For a Gaussian illustration, suppose that $Z_t = \big( Z_t^{(1)}, Z_t^{(2)} \big)'$ is multivariate normal with mean $0$ and a prescribed covariance matrix. Now set
\[
M_t = \begin{pmatrix} F_1^{-1}\big( \Phi\big( Z_t^{(1)} \big) \big) \\[3pt] F_2^{-1}\big( \Phi\big( Z_t^{(2)} \big) \big) \end{pmatrix}, \tag{5.26}
\]
where (·) is the CDF of the standard normal random variable and F
i
1
(y) is the small-
est x such that P(X x) y for X distributed as F
i
, i =1, 2 (this denition of the inverse
CDF has many nice properties—see Theorem 25.6 in Billingsley 1995). The range of pos-
sible correlations and characteristics of the transform in (5.26) are discussed in Yahav and
Shmueli (2012). If {Z
t
} is IID, then {M
t
} is also IID and γ
(1,2)
(h) = 0for h = 1, 2, ....How-
Y
ever, γ
Y
(1,2)
(0) = 0 and there is nonzero cross-correlation between components of Y
t
(at lag
zero) in general. Nonzero cross-covariances can be obtained at lags h = 1, 2, ... by allow-
ing {Z
t
} to be a stationary bivariate Gaussian process. We will not attempt to derive the
autocovariance function of such a series here.
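A minimal sketch of (5.26) with a Gaussian copula is given below (our own illustration; the Poisson means, the correlation value, and the sample size are arbitrary). The componentwise quantile transform uses scipy's `ppf`, which implements the "smallest $x$ such that $P(X \leq x) \geq y$" inverse described above.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# illustrative choices (ours): Poisson marginals with means 12 and 20, normal correlation 0.7
lam1, lam2, rho, n = 12.0, 20.0, 0.7, 2000
Z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)  # IID bivariate normal

# (5.26): M_t^(i) = F_i^{-1}(Phi(Z_t^(i))) with F_i the Poisson CDFs
U = stats.norm.cdf(Z)
M = np.column_stack([stats.poisson.ppf(U[:, 0], lam1),
                     stats.poisson.ppf(U[:, 1], lam2)]).astype(int)

print("sample means:", M.mean(axis=0))                                     # close to (12, 20)
print("lag-0 cross-correlation:", np.corrcoef(M[:, 0], M[:, 1])[0, 1])     # clearly nonzero
print("lag-1 cross-correlation:", np.corrcoef(M[:-1, 0], M[1:, 1])[0, 1])  # near 0 for IID Z_t
```

This produces the IID bivariate $\{M_t\}$ only; to obtain $Y_t$ itself, each component of $M_t$ would still drive its renewal superposition in (5.25), and replacing the IID $Z_t$ by a stationary bivariate Gaussian process is what yields nonzero cross-covariances at positive lags.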
Figure 5.4 shows a realization of (5.25) of length $n = 100$ from Gaussian copula methods that produce Poisson marginal distributions for the two components. Here, $\{M_t\}$ from (5.26) was generated via a Gaussian copula, with component one of $\{Z_t\}$ having mean 12 and component two having mean 20. The covariance matrix was selected to have ones on