106 Handbook of Discrete-Valued Time Series
Proof: Observe that
$$E[\min(M_1, M_2)] = \sum_{k=0}^{\infty}\sum_{j=0}^{\infty} \min(k, j)\,\frac{e^{-\lambda}\lambda^{k}}{k!}\,\frac{e^{-\lambda}\lambda^{j}}{j!} = \sum_{k=1}^{\infty}\sum_{j=k}^{\infty} k\,\frac{e^{-2\lambda}\lambda^{k+j}}{k!\,j!} + \sum_{k=1}^{\infty}\sum_{j=1}^{k-1} j\,\frac{e^{-2\lambda}\lambda^{k+j}}{k!\,j!}.$$
Adding and subtracting the diagonal terms (where j = k in the double summation) gives
$$E[\min(M_1, M_2)] = \sum_{k=1}^{\infty}\sum_{j=k}^{\infty} k\,\frac{e^{-2\lambda}\lambda^{k+j}}{k!\,j!} + \sum_{k=1}^{\infty}\sum_{j=1}^{k} j\,\frac{e^{-2\lambda}\lambda^{k+j}}{k!\,j!} - \sum_{k=1}^{\infty} k\,\frac{e^{-2\lambda}\lambda^{2k}}{k!\,k!}.$$
The two double sums mentioned earlier can be shown to be equal (relabel k and j in the second and interchange the order of summation); hence,
$$E[\min(M_1, M_2)] = 2\sum_{k=1}^{\infty}\sum_{j=k}^{\infty} k\,\frac{e^{-2\lambda}\lambda^{k+j}}{k!\,j!} - \sum_{k=1}^{\infty} k\,\frac{e^{-2\lambda}\lambda^{2k}}{k!\,k!}. \tag{5.12}$$
Dealing with the second summation in (5.12) first, we have
$$\sum_{k=1}^{\infty} \frac{k\,e^{-2\lambda}\lambda^{2k}}{(k!)^{2}} = e^{-2\lambda}\sum_{\ell=0}^{\infty} \frac{\lambda^{2\ell+2}}{\ell!\,(\ell+1)!} = \lambda e^{-2\lambda} I_{1}(2\lambda).$$
For the first summation in (5.12),
$$2\sum_{k=1}^{\infty}\sum_{j=k}^{\infty} \frac{k\,e^{-2\lambda}\lambda^{k+j}}{k!\,j!} = 2\sum_{\ell=0}^{\infty}\sum_{j=\ell+1}^{\infty} \frac{e^{-2\lambda}\lambda^{\ell+j+1}}{\ell!\,j!} = 2\lambda\sum_{\ell=0}^{\infty} P(M_{1} = \ell,\, M_{2} > \ell).$$
Since $M_1$ and $M_2$ are independent and identically distributed,
$$2\sum_{k=1}^{\infty}\sum_{j=k}^{\infty} \frac{k\,e^{-2\lambda}\lambda^{k+j}}{k!\,j!} = \lambda\sum_{\ell=0}^{\infty}\left[P(M_{1} = \ell,\, M_{2} > \ell) + P(M_{1} > \ell,\, M_{2} = \ell)\right] = \lambda\left[1 - \sum_{\ell=0}^{\infty} P(M_{1} = M_{2} = \ell)\right],$$
where the last equality holds because, for each $\ell$, the two events in brackets together with $\{M_1 = M_2 = \ell\}$ partition $\{\min(M_1, M_2) = \ell\}$.
107 Renewal-Based Count Time Series
Using $P[M_1 = M_2 = \ell] = P[M_1 = \ell]^{2}$ and the Poisson probabilities now gives
$$2\sum_{k=1}^{\infty}\sum_{j=k}^{\infty} \frac{k\,e^{-2\lambda}\lambda^{k+j}}{k!\,j!} = \lambda\left[1 - e^{-2\lambda} I_{0}(2\lambda)\right]$$
and completes our work.
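Combining the two pieces of (5.12) yields the closed form $E[\min(M_1, M_2)] = \lambda[1 - e^{-2\lambda}(I_0(2\lambda) + I_1(2\lambda))]$. As a numerical sanity check (our illustrative sketch, not part of the chapter), the truncated double sum can be compared against this expression, with the modified Bessel functions evaluated from their power series:

```python
import math

def bessel_i(n, x, terms=60):
    # Modified Bessel function of the first kind: I_n(x) = sum_m (x/2)^(2m+n) / (m! (m+n)!)
    return sum((x / 2.0) ** (2 * m + n) / (math.factorial(m) * math.factorial(m + n))
               for m in range(terms))

def emin_series(lam, trunc=80):
    # E[min(M1, M2)] for independent Poisson(lam) variates, via the truncated double sum
    p = [math.exp(-lam)]
    for k in range(1, trunc):
        p.append(p[-1] * lam / k)          # Poisson pmf recursion p_k = p_{k-1} * lam / k
    return sum(min(k, j) * p[k] * p[j] for k in range(trunc) for j in range(trunc))

def emin_closed(lam):
    # lam * (1 - e^{-2 lam} * (I_0(2 lam) + I_1(2 lam)))
    return lam * (1.0 - math.exp(-2.0 * lam) * (bessel_i(0, 2 * lam) + bessel_i(1, 2 * lam)))
```

The two routes agree to within the truncation error for moderate λ.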
The autocorrelation function of $\{Y_t\}$ is similar in form to (5.6):
$$\mathrm{Corr}(Y_{t}, Y_{t+h}) = \frac{C(\lambda)}{\lambda}\left(u_{h} - \frac{1}{\mu}\right), \quad h = 1, 2, \ldots. \tag{5.13}$$
Series with negative autocorrelations are easily constructed. To see this, again let L be the lifetime supported on {1, 2, 3} with probabilities P[L = 1] = P[L = 3] = ε and P[L = 2] = 1 − 2ε for some small ε > 0. Then μ = 2, u_1 = ε, and the random pair $(Y_t, Y_{t+1})$ has Poisson marginal distributions with the same mean λ/2 and
$$\mathrm{Corr}(Y_{t}, Y_{t+1}) = \frac{C(\lambda)}{\lambda}\left(\varepsilon - \frac{1}{2}\right).$$
Letting ε ↓ 0 shows existence of a random pair with Poisson marginals with the same mean λ/2, whose correlation is arbitrarily close to −C(λ)/(2λ). This is close to the most negative correlation possible. For recent updates on this problem and bounds, see Shin and Pasupathy (2010) and Yahav and Shmueli (2012).
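The renewal probabilities u_h for this lifetime are easy to compute numerically from the renewal equation $u_h = \sum_{k=1}^{h} P(L = k)\,u_{h-k}$ with $u_0 = 1$ (a sketch with our own naming, not from the chapter):

```python
def renewal_probs(pmf, hmax):
    # pmf: dict mapping lifetime k -> P(L = k); returns [u_0, ..., u_hmax]
    # via the renewal equation u_h = sum_k P(L = k) * u_{h-k}, with u_0 = 1.
    u = [1.0]
    for h in range(1, hmax + 1):
        u.append(sum(p * u[h - k] for k, p in pmf.items() if k <= h))
    return u

eps = 0.1
u = renewal_probs({1: eps, 2: 1.0 - 2.0 * eps, 3: eps}, 100)
# u_1 = eps < 1/mu = 1/2, so (5.13) gives a negative lag-one correlation,
# while u_h -> 1/mu = 1/2 as h grows (the process forgets its start)
```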
A classical renewal result (Smith, 1958; Feller, 1968) states that $\sum_{h=0}^{\infty} |u_{h} - \mu^{-1}| < \infty$ if and only if $E[L^{2}] < \infty$. Since $\gamma_{Y}(h)$ is proportional to $u_{h} - \mu^{-1}$ (see (5.20) below), renewal series will have long memory whenever $E[L^{2}] = \infty$ (recall that $E[L] < \infty$ is presupposed). Long memory count series are discussed further in Lund et al. (2015; Chapter 21 in this volume).
Figure 5.2 displays a sample path of length 500 of a count series with Poisson marginals and its sample autocorrelations. This series was generated by taking $\{M_t\}$ in (5.10) as Poisson with mean λ = 20 and L as a shifted discrete Pareto variable: L = 1 + R, where R is the Pareto random variate satisfying $P(R = k) = ck^{-\alpha}$ for k ≥ 1 with α = 2.5. Here, c is a constant, depending on α, that makes the distribution sum to unity. Since L ≥ 2, one cannot have a renewal at time 1 in a nondelayed process. Hence, $u_1 = 0$, $u_1 - \mu^{-1}$ is negative, and long memory features have been made in tandem with a negative lag one autocorrelation! Negative autocorrelations at other lags can be devised with non-Pareto choices of L.
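A simulation along these lines can be sketched as follows (our code and truncation choices, not the authors'; the Pareto support is truncated at a large K so that inverse-transform sampling stays simple, and each renewal indicator process is started from the stationary delay distribution $P(D = j) = P(L > j)/\mu$):

```python
import bisect, math, random

ALPHA, KMAX = 2.5, 10_000        # Pareto exponent; support truncation (our choice)
LAM, N, COPIES = 20.0, 2000, 60  # mean of M_t, series length, renewal copies kept

rng = random.Random(1)

# Lifetime L = 1 + R with P(R = k) proportional to k^(-ALPHA), k = 1..KMAX, so L >= 2.
wts = [k ** -ALPHA for k in range(1, KMAX + 1)]
tot = sum(wts)
pL = {k + 1: wts[k - 1] / tot for k in range(1, KMAX + 1)}   # P(L = k + 1)
mu = sum(k * p for k, p in pL.items())                        # E[L], about 2.9 here

def make_sampler(pairs):
    # inverse-transform sampler built from (value, probability) pairs
    vals, cum, s = [], [], 0.0
    for v, p in pairs:
        s += p; vals.append(v); cum.append(s)
    return lambda: vals[bisect.bisect_left(cum, rng.random() * s)]

draw_L = make_sampler(sorted(pL.items()))

# Stationary delay: P(D = j) = P(L > j)/mu makes each indicator series stationary.
delay, surv = [], 1.0
for j in range(KMAX + 1):
    delay.append((j, max(surv, 0.0) / mu))
    surv -= pL.get(j + 1, 0.0)
draw_D = make_sampler(delay)

def indicator_series(n):
    # one stationary 0/1 renewal indicator process of length n
    x = [0] * n
    t = draw_D()
    while t < n:
        x[t] = 1
        t += draw_L()
    return x

def poisson(lam):
    # Knuth's product method; fine for moderate lam
    thresh, k, prod = math.exp(-lam), 0, rng.random()
    while prod >= thresh:
        prod *= rng.random()
        k += 1
    return k

procs = [indicator_series(N) for _ in range(COPIES)]
M = [min(poisson(LAM), COPIES) for _ in range(N)]   # cap at COPIES; P(M > 60) is negligible
Y = [sum(procs[i][t] for i in range(M[t])) for t in range(N)]

ybar = sum(Y) / N                                   # should be near LAM / mu
acf1 = (sum((Y[t] - ybar) * (Y[t + 1] - ybar) for t in range(N - 1))
        / sum((y - ybar) ** 2 for y in Y))          # markedly negative, as in Figure 5.2
```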
The earlier results are useful for multivariate Poisson generation, an active area of research (Karlis and Ntzoufras, 2003). Generating a random pair (X, Y), with X and Y each having a Poisson marginal distribution with the same mean, but with negative correlation, is nontrivial. In fact, suppose that M is a deterministic positive integer and
$$X = \sum_{i=1}^{M} B_{i}, \qquad Y = \sum_{i=1}^{M} (1 - B_{i}), \tag{5.14}$$
where $\{B_{i}\}_{i=1}^{M}$ are IID Bernoulli random variables with the same success probability p = 1/2. Then X and Y have binomial distributions with M trials and the same mean M/2; moreover, Corr(X, Y) = −1. However, should one replace the deterministic M by a Poisson
[Figure 5.2 appears here: top panel, "A realization of a Poisson series" (Count versus Time, t = 0 to 500); bottom panel, sample autocorrelations at lags 0 to 25.]
FIGURE 5.2
A realization of length 500 of a long memory stationary time series with Poisson marginal distributions with mean
6.895 (to three decimal places). Sample autocorrelations are also shown with pointwise 95% critical bounds for
white noise. Notice that negative correlations can be achieved.
random M with mean λ, then both X and Y have Poisson distributions with mean λ/2,
but now Corr(X, Y) = 0. While there are Poisson partitioning interpretations of the previous (Ross, 1996), the point is that generating negatively correlated Poisson variates can
be tricky.
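Both claims are easy to confirm empirically (an illustrative sketch with our own helper names):

```python
import math, random

rng = random.Random(7)

def corr(xs, ys):
    # sample correlation of two equal-length sequences
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

def split_pair(m):
    # X counts successes among m Bernoulli(1/2) trials; Y counts the failures
    x = sum(rng.random() < 0.5 for _ in range(m))
    return x, m - x

def poisson(lam):
    # Knuth's product method
    thresh, k, prod = math.exp(-lam), 0, rng.random()
    while prod >= thresh:
        prod *= rng.random()
        k += 1
    return k

det = [split_pair(10) for _ in range(20000)]             # deterministic M = 10
ran = [split_pair(poisson(8.0)) for _ in range(20000)]   # Poisson M with mean 8
c_det = corr([x for x, _ in det], [y for _, y in det])   # -1: X + Y = 10 is constant
c_ran = corr([x for x, _ in ran], [y for _, y in ran])   # near 0: Poisson splitting
```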
5.2.3 Geometric Marginals
Stationary geometric count series can also be constructed from the renewal class. For concreteness, we clarify geometric and zero-modified geometric distributions. A geometric random variable Z takes values in {1, 2, ...}, with probabilities $P(Z = k) = p(1 - p)^{k-1}$ for some p ∈ (0, 1); the zero-modified geometric distribution has $P(Z = k) = p(1 - p)^{k}$ on the support set k ∈ {0, 1, 2, ...}, which includes zero. To construct stationary count series $\{Y_t\}$ with geometric marginal distributions with success probability p and generating function
$$E\left[r^{Y_{t}}\right] = \frac{pr}{1 - (1 - p)r}, \tag{5.15}$$
let us attempt to write $Y_t$ as the superposition
$$Y_{t} = 1 + \sum_{i=1}^{M_{t}} X_{i,t}, \tag{5.16}$$
where $\{M_{t}\}_{t=1}^{\infty}$ is an IID sequence of random variables to be determined. Computing $E\left[r^{Y_{t}}\right]$ via (5.16) provides
$$E\left[r^{Y_{t}}\right] = r\,\psi_{M}\!\left(1 + \frac{r - 1}{\mu}\right), \tag{5.17}$$
where $\psi_{M}(r) = E\left[r^{M_{t}}\right]$. If such a representation is possible, (5.15) and (5.17) must agree:
$$\frac{p}{1 - (1 - p)r} = \psi_{M}\!\left(1 + \frac{r - 1}{\mu}\right).$$
Replacing $1 + (r - 1)/\mu$ by z and simplifying gives $\psi_{M}(z) = p/[p + (1 - p)\mu(1 - z)]$. Such a $\psi_{M}(\cdot)$ is a legitimate generating function. In fact, $\psi_{M}(\cdot)$ is the probability generating function of a zero-modified geometric random variate with success probability $\alpha = p/[p + (1 - p)\mu]$.
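One can check numerically that this choice of $M_t$ delivers geometric marginals: since each $X_{i,t}$ is marginally Bernoulli(1/μ), mixing a Binomial(m, 1/μ) count over the zero-modified geometric weights for m recovers $P(Y_t = k) = p(1 - p)^{k-1}$. A sketch (the truncation at m = 500 is our choice):

```python
import math

p, mu = 0.5, 2.0
alpha = p / (p + (1 - p) * mu)     # success probability of M_t; 1/3 here
TRUNC = 500                        # truncation of the mixture over m (our choice)

def prob_Y(y):
    # P(Y = y) = sum_m P(M = m) * P(Binomial(m, 1/mu) = y - 1), from (5.16)
    s = y - 1
    return sum(alpha * (1 - alpha) ** m
               * math.comb(m, s) * (1 / mu) ** s * (1 - 1 / mu) ** (m - s)
               for m in range(s, TRUNC))

geom = lambda k: p * (1 - p) ** (k - 1)   # target geometric pmf on {1, 2, ...}
```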
The moment structure of $\{Y_t\}$ is $E[Y_t] \equiv 1/p$ and
$$\gamma_{Y}(h) = \frac{E[\min(M_{t}, M_{t+h})]}{\mu}\left(u_{h} - \frac{1}{\mu}\right), \quad h = 1, 2, \ldots.$$
When h = 0, $\gamma_{Y}(0) = (1 - p)/p^{2}$. Evaluating $E[\min(M_{t}, M_{t+h})]$ is easier than the analogous Poisson computation: when h > 0,
$$E[\min(M_{t}, M_{t+h})] = \sum_{\ell=0}^{\infty} P[\min(M_{t}, M_{t+h}) > \ell] = \sum_{\ell=0}^{\infty} P[M_{t} > \ell]^{2}.$$
Using $P[M_{t} > k] = (1 - \alpha)^{k+1}$ now gives $E[\min(M_{t}, M_{t+h})] = (1 - \alpha)^{2}/[1 - (1 - \alpha)^{2}]$.
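This closed form can be verified against brute-force summation over the zero-modified geometric pmf (a quick illustrative sketch):

```python
alpha = 1 / 3                                 # e.g., p = 1/2 and mu = 2 give alpha = 1/3
pmf = lambda k: alpha * (1 - alpha) ** k      # zero-modified geometric pmf on {0, 1, ...}

# truncated double sum for E[min(M_t, M_{t+h})], h > 0 (M_t IID across t)
direct = sum(min(k, j) * pmf(k) * pmf(j) for k in range(300) for j in range(300))
closed = (1 - alpha) ** 2 / (1 - (1 - alpha) ** 2)   # equals 0.8 at alpha = 1/3
```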
Again, negatively correlated series at any lag h can be produced by letting L be such that $u_{h} < \mu^{-1}$. Let L be the lifetime supported on {1, 2, 3} with P(L = 1) = P(L = 3) = ε and P(L = 2) = 1 − 2ε. Then $u_1 = \varepsilon$ and the correlation at lag 1 can be verified to converge to $-p(1 - p)/(4 - 3p)$ as ε ↓ 0.
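The limiting value follows by combining the pieces above, and a short computation confirms it (our sketch; a tiny ε stands in for the limit):

```python
def lag1_corr(p, eps):
    # lag-one correlation for the {1, 2, 3} lifetime: u_1 = eps, mu = 2
    alpha = p / (2 - p)                       # success probability of M_t
    emin = (1 - alpha) ** 2 / (1 - (1 - alpha) ** 2)
    gamma1 = (emin / 2.0) * (eps - 0.5)       # gamma_Y(1) = (E[min]/mu) * (u_1 - 1/mu)
    return gamma1 / ((1 - p) / p ** 2)        # divide by gamma_Y(0) = (1 - p)/p^2

limit = lambda p: -p * (1 - p) / (4 - 3 * p)  # the stated eps -> 0 limit
```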
Figure 5.3 shows a sample path of a geometric count series of length 1000 along with its
sample autocorrelations. Here, we have taken L = 1 + P, where P is Poisson with a unit
mean (E[L] = μ = 2). The chosen value of α is 1/3, so that p = 1/2. Again, note the negative
autocorrelations.
Finally, the tactics in (5.14) are not useful in generating negatively correlated geometric variates. Elaborating, set
$$X = 1 + \sum_{i=1}^{M} B_{i}, \qquad Y = 1 + \sum_{i=1}^{M} (1 - B_{i}),$$
where the individual Bernoulli trials $\{B_{i}\}_{i=1}^{\infty}$ have mean $E[B_i] = 1/2$ (this corresponds to lifetimes with μ = 2 and implies that X and Y have the same distribution) and M has a zero-modified geometric distribution with success probability α = p/(2 − p). From the above,
[Figure 5.3 appears here: top panel, "A realization of a Geometric series" (Count versus Time, t = 0 to 1000); bottom panel, sample autocorrelations at lags 0 to 25.]
FIGURE 5.3
A realization of length 1000 of a stationary time series with Geometric marginal distributions with success prob-
ability 1/2. Sample autocorrelations are also shown along with pointwise 95% critical bounds for white noise.
Notice that negative correlations can be achieved.
we know that both X and Y have the same geometric distribution with the success probability p. However, a simple computation gives
$$\mathrm{Corr}(X, Y) = \frac{\sigma_{M}^{2} - \mu_{M}}{\sigma_{M}^{2} + \mu_{M}} = \frac{1 - \alpha}{1 + \alpha} = 1 - p,$$
which is positive for any p ∈ (0, 1).
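The positive correlation is easy to confirm by exact enumeration over M, using the conditional Binomial moments of the split (our sketch; the truncation at m = 400 is an assumption for numerical convenience):

```python
p = 0.5
alpha = p / (2 - p)                 # success probability of M; 1/3 here
TRUNC = 400                         # truncation of the geometric mixture (our choice)
pm = [alpha * (1 - alpha) ** m for m in range(TRUNC)]

# conditional on M = m: S ~ Binomial(m, 1/2), X = 1 + S, Y = 1 + M - S
EX   = sum(pm[m] * (1 + m / 2) for m in range(TRUNC))
EXY  = sum(pm[m] * (1 + m + (m * m - m) / 4) for m in range(TRUNC))
VarX = sum(pm[m] * (m / 4 + (1 + m / 2 - EX) ** 2) for m in range(TRUNC))
corr_XY = (EXY - EX * EX) / VarX    # Cov(X, Y) / (sd(X) sd(Y)); Var(Y) = Var(X) here
```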
5.2.4 Generalities
Since any nonnegative integer-valued random variable can be built from Bernoulli trials, any count marginal distribution can be achieved from renewal methods. It may be surprising how easily this is done. Consider trying to build a prespecified marginal distribution for $Y_t$ supported on {c, c + 1, ...} with a renewal superposition of the form
$$Y_{t} = c + \sum_{i=1}^{M_{t}} X_{t,i}. \tag{5.18}$$