10.6 Exercises on Chapter 10

1. Show that in importance sampling the choice

q(x) = |f(x)| p(x) / ∫ |f(x′)| p(x′) dx′

minimizes the variance of the resulting estimator of ∫ f(x) p(x) dx even in cases where f(x) is not of constant sign.
2. Suppose that x has a Cauchy distribution C(0, 1). It is easily shown that θ = P(x ≥ 2) = 1/2 − π⁻¹ tan⁻¹ 2 ≈ 0.147 583 6, but we will consider Monte Carlo methods of evaluating this probability.
a. Show that if k is the number of values greater than or equal to 2 in a random sample of size n from a Cauchy distribution, then k/n is an unbiased estimate of θ with variance θ(1 − θ)/n = 0.125 802 7/n.
b. Let p(x) = 2/x² for x ≥ 2, so that ∫₂^∞ p(x) dx = 1. Show that if x is uniformly distributed over the unit interval then y = 2/x has the density p(y) and that all values of y satisfy y ≥ 2, and hence that

θ̂ = y²/{2π(1 + y²)}

gives an estimate of θ by importance sampling.
c. Deduce that if x1, x2, …, xn are independent U(0, 1) variates then

θ̂ = (1/n) Σᵢ 2/{π(4 + xi²)}

gives an estimate of θ.
d. Check that θ̂ is an unbiased estimate of θ (using ∫₀¹ 2/{π(4 + x²)} dx = π⁻¹ tan⁻¹ ½ = θ) and show that

∫₀¹ [2/{π(4 + x²)}]² dx = 0.021 876 4

and deduce that

V(θ̂) = (0.021 876 4 − θ²)/n ≈ 0.000 095 5/n

so that this estimator has a notably smaller variance than the estimate considered in (a).
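The comparison between the estimators in (a) and (c) can be checked by simulation; a minimal sketch (not part of the original exercise), with Cauchy variates generated by inversion:

```python
import random
from math import atan, tan, pi

random.seed(0)
n = 100_000
theta = 0.5 - atan(2) / pi  # exact tail probability, about 0.147 583 6

# (a) Crude Monte Carlo: proportion of a Cauchy sample with value >= 2,
# using the inversion x = tan(pi*(u - 1/2)) for u ~ U(0, 1).
k = sum(1 for _ in range(n) if tan(pi * (random.random() - 0.5)) >= 2)
crude = k / n

# (c) Importance sampling estimate computed directly from U(0, 1) variates.
imp = sum(2 / (pi * (4 + random.random() ** 2)) for _ in range(n)) / n
```

Both estimates should be close to θ, but the importance sampling estimate is far more precise, its variance being roughly 0.000 095 5/n against 0.125 802 7/n for the crude proportion.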
3. Apply sampling importance re-sampling starting from random variables uniformly distributed over (0, 1) to estimate the mean and variance of a beta distribution Be(2, 3).
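A minimal sketch of sampling importance re-sampling for this exercise, assuming weights proportional to the unnormalized Be(2, 3) density x(1 − x)²:

```python
import random

random.seed(0)
m = 50_000
xs = [random.random() for _ in range(m)]        # proposal draws from U(0, 1)
ws = [x * (1 - x) ** 2 for x in xs]             # unnormalized Be(2, 3) weights
resample = random.choices(xs, weights=ws, k=m)  # re-sample with replacement

mean = sum(resample) / m
var = sum((x - mean) ** 2 for x in resample) / m
```

The results should be close to the exact values, namely a mean of 2/5 = 0.4 and a variance of (2 × 3)/{(2 + 3)²(2 + 3 + 1)} = 0.04.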
4. Use the sample found in Section 10.5 to find a 90% HDR for Be(2, 3) and compare the resultant limits with the values found using the methodology of Section 3.1. Why do the values differ?
5. Apply the methodology used in the numerical example in Section 10.2 to the data set used in both Exercise 16 on Chapter 2 and Exercise 5 on Chapter 9.
6. Find the Kullback–Leibler divergence KL(q‖p) when p is a binomial distribution B(n, π) and q is a binomial distribution B(n, ρ). When does KL(q‖p) = 0?
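As a numerical check on this exercise, the divergence between two binomials can be computed directly from the definition and compared with the closed form n[ρ log(ρ/π) + (1 − ρ) log{(1 − ρ)/(1 − π)}]; a sketch, with the direction KL(q‖p) assumed:

```python
from math import comb, log

def kl_binom_direct(n, rho, pi_):
    # sum over k of q(k) log(q(k)/p(k)) with q = B(n, rho), p = B(n, pi_)
    total = 0.0
    for k in range(n + 1):
        q = comb(n, k) * rho ** k * (1 - rho) ** (n - k)
        p = comb(n, k) * pi_ ** k * (1 - pi_) ** (n - k)
        total += q * log(q / p)
    return total

def kl_binom_closed(n, rho, pi_):
    return n * (rho * log(rho / pi_) + (1 - rho) * log((1 - rho) / (1 - pi_)))
```

The divergence vanishes exactly when ρ = π, in which case every term of the sum is zero.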
7. Find the Kullback–Leibler divergence KL(q‖p) when p is a normal distribution N(μ, φ) and q is a normal distribution N(ν, ψ).
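The closed form for two normals, KL(q‖p) = ½[log(φ/ψ) + {ψ + (ν − μ)²}/φ − 1] with φ and ψ as variances, can likewise be verified by numerical integration; a sketch:

```python
from math import exp, log, sqrt, pi

def norm_pdf(x, mean, var):
    # normal density with the second parameter a variance, not a standard deviation
    return exp(-(x - mean) ** 2 / (2 * var)) / sqrt(2 * pi * var)

def kl_normal_closed(nu, psi, mu, phi):
    # KL(q||p) with q = N(nu, psi) and p = N(mu, phi)
    return 0.5 * (log(phi / psi) + (psi + (nu - mu) ** 2) / phi - 1)

def kl_normal_numeric(nu, psi, mu, phi, lo=-25.0, hi=25.0, steps=100_000):
    # midpoint rule for the integral of q(x) log(q(x)/p(x))
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        x = lo + (i + 0.5) * h
        q = norm_pdf(x, nu, psi)
        if q > 1e-300:  # skip points where q has underflowed to zero
            total += q * log(q / norm_pdf(x, mu, phi)) * h
    return total
```

The divergence is zero exactly when ν = μ and ψ = φ.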
8. Let p be the density √(2/π) exp(−x²/2) (x > 0) of the modulus x = |z| of a standard normal variate z, and let q be the density λ exp(−λx) (x > 0) of an E(λ) (exponential) distribution. Find the value of λ such that q is as close an approximation to p as possible in the sense that the Kullback–Leibler divergence KL(q‖p) is a minimum.
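Since E_q[x²] = 2/λ² for an E(λ) variate, KL(q‖p) here reduces to log λ − 1 − ½ log(2/π) + 1/λ², which calculus shows is minimized at λ = √2 for this direction of the divergence (minimizing KL(p‖q) instead would give λ = √(π/2)). A numerical sketch confirming this:

```python
from math import exp, log, pi, sqrt

def kl_closed(lam):
    # KL(q||p) for q = Exp(lam), p(x) = sqrt(2/pi) exp(-x^2/2) on x > 0:
    # E_q[log q] = log(lam) - 1 and E_q[log p] = 0.5*log(2/pi) - 1/lam**2
    return log(lam) - 1 - 0.5 * log(2 / pi) + 1 / lam ** 2

def kl_numeric(lam, hi=25.0, steps=100_000):
    # midpoint rule for the integral of q(x) log(q(x)/p(x)) over (0, hi)
    h = hi / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * h
        q = lam * exp(-lam * x)
        p = sqrt(2 / pi) * exp(-x * x / 2)
        total += q * log(q / p) * h
    return total

# Grid search over lambda for the minimizer of the closed form.
best = min(range(500, 3000), key=lambda l: kl_closed(l / 1000)) / 1000
```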
9. The paper by Corduneanu and Bishop (2001) referred to in Section 10.3 can be found on the web. Härdle’s data set is available in R by typing data(faithful). Fill in the details of the analysis of a mixture of multivariate normals given in that section.
10. Carry out the calculations in Section 10.4 for the genetic linkage data quoted by Smith which was given in Exercise 3 on Chapter 9.
11. A group of n students sit two exams. Exam one is on history and exam two is on chemistry. Let xi and yi denote the ith student’s score in the history and chemistry exams, respectively. The following linear regression model is proposed for the relationship between the two exam scores:

Unnumbered Display Equation

where εi ∼ N(0, σ²).
Assume that the εi are independent of one another and of the xi, and that α, β and σ² are unknown parameters to be estimated.
Describe a reversible jump MCMC algorithm, including discussion of the acceptance probability, to move between the four competing models:
1. α = 0, β = 0;
2. α ≠ 0, β = 0;
3. α = 0, β ≠ 0;
4. α ≠ 0, β ≠ 0.
Note that if z is a random variable with probability density function f given by

f(z) ∝ exp{−(az² − 2bz)/2}   (a > 0),

then z ∼ N(b/a, 1/a) [due to P. Neal].
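The exercise asks for a description; as a complement, here is a minimal numerical sketch under loudly flagged assumptions not in the original: simulated exam data (none are given), error variance σ² = 1 taken as known, independent N(0, 25) priors on α and β, a uniform prior over the four models, and birth proposals drawn from the prior so that the prior and proposal densities cancel (with Jacobian 1) and the reversible jump acceptance probability collapses to a likelihood ratio:

```python
import random
from math import log, sqrt

random.seed(42)

# Hypothetical simulated exam scores: true alpha = 1, beta = 1, sigma^2 = 1.
n = 50
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [1.0 + 1.0 * x + random.gauss(0, 1) for x in xs]

TAU2 = 25.0  # assumed N(0, TAU2) prior variance for alpha and for beta

def loglik(a, b):
    # log likelihood up to a constant, with sigma^2 = 1 known
    return -0.5 * sum((y - a - b * x) ** 2 for x, y in zip(xs, ys))

def coef(v):
    # an excluded coefficient (None) contributes zero to the regression
    return v if v is not None else 0.0

# State: which coefficients are present; the four (alpha, beta) patterns
# correspond to the four competing models.
state = {"a": None, "b": None}
visits = {(ia, ib): 0 for ia in (0, 1) for ib in (0, 1)}

for _ in range(10_000):
    # Trans-dimensional move: pick one coefficient and propose toggling it.
    # Births draw the new value from its prior; with uniform model priors and
    # a symmetric choice of move, the acceptance ratio is the likelihood ratio.
    key = random.choice(["a", "b"])
    prop = dict(state)
    prop[key] = None if state[key] is not None else random.gauss(0, sqrt(TAU2))
    logr = (loglik(coef(prop["a"]), coef(prop["b"]))
            - loglik(coef(state["a"]), coef(state["b"])))
    if log(random.random()) < logr:
        state = prop

    # Within-model random walk Metropolis update for each coefficient present.
    for key in ("a", "b"):
        if state[key] is None:
            continue
        prop = dict(state)
        prop[key] = state[key] + random.gauss(0, 0.2)
        logr = (loglik(coef(prop["a"]), coef(prop["b"]))
                - loglik(coef(state["a"]), coef(state["b"]))
                + (state[key] ** 2 - prop[key] ** 2) / (2 * TAU2))
        if log(random.random()) < logr:
            state = prop

    visits[(int(state["a"] is not None), int(state["b"] is not None))] += 1
```

The visit frequencies estimate the posterior model probabilities; with data simulated as above, the full model (α ≠ 0, β ≠ 0) should dominate.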

1. Often denoted D_KL(q‖p) or KL(q‖p).

2. Those with a background in statistical physics sometimes refer to the variational lower bound as the (negative) variational free energy because it can be expressed as an ‘energy’

∫ q(θ) log p(x, θ) dθ

plus the entropy

−∫ q(θ) log q(θ) dθ

but it is not necessary to know about the reasons for this.

3. In that subsection, we wrote S where we will now write SS, we wrote  where we will now write  , and we wrote θ0 where we will now write  .
