Department of Statistics, Yale University

STAT242b Theory of Statistics

Suggested Solutions to Homework 6

Compiled by Marco Pistagnesi

Problem 11.2

a) Omitted. Just a suggestion: do not report the actual numbers! Who cares about 100, 500, or 1000 raw random draws, and who could ever interpret them? Be sensible and report relevant information; for example, use the function summary().
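For part a), a minimal R sketch of the simulation together with the suggested summary; the sample size n = 100 and true mean 5 used here are illustrative assumptions, so substitute whatever values the problem specifies:

# Simulate the data set; n = 100 and mu = 5 are assumed here for illustration
set.seed(1)
n  <- 100
mu <- 5
x  <- rnorm(n, mean = mu, sd = 1)

# Report a compact summary rather than the raw draws
summary(x)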

b) Here $X_1,\dots,X_n \sim N(\mu,1)$ and the prior distribution is flat, $f(\mu) \propto 1$. Then we consider the joint density function:

$$f(x_1,\dots,x_n \mid \mu) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}}\, e^{-(x_i-\mu)^2/2} \;\propto\; e^{-\frac{n}{2}(\mu-\bar{x})^2}.$$

Thus we find the posterior density (up to a constant):

$$f(\mu \mid x_1,\dots,x_n) \;\propto\; f(\mu)\, f(x_1,\dots,x_n \mid \mu) \;\propto\; e^{-\frac{n}{2}(\mu-\bar{x})^2}. \qquad (1)$$

We recognize the form of (1) as a normal density in $\mu$, and thus by Bayesian inference the posterior distribution is $\mu \mid x_1,\dots,x_n \sim N(\bar{x},\, 1/n)$.

Remark: Note from this example how Bayesian inference works, and make sure you understand the point. When we divide the joint (likelihood) by the marginal of the x's, we actually end up just rearranging the joint, since the marginal does not depend on the unknown parameter $\mu$ and so does not crucially affect its posterior distribution. The whole point is to rearrange the joint cleverly, so that by playing with the constant term we can re-express it as a density function for the parameter. Notice how in (1) one starts from a joint density of the data and ends up with a density for the parameter; you need to look for a clever way to perform this switch. The lesson is: with Bayesian inference, do not perform blind calculations without thinking (especially about what the constant term will be), as this will lead you to horrible, meaningless expressions. Always look through your formula and figure out the suitable constant that leads to a (on a test, typically familiar) distributional form for the posterior.
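As a sanity check on the algebra, one can overlay the derived $N(\bar{x}, 1/n)$ density on a histogram of draws simulated from that posterior. A minimal R sketch, continuing from the simulated x and n in part a):

# Posterior under the flat prior: mu | data ~ N(xbar, 1/n)
xbar <- mean(x)
post <- rnorm(1000, mean = xbar, sd = 1 / sqrt(n))

# Histogram of posterior draws with the analytic density overlaid
hist(post, breaks = 30, freq = FALSE, main = "Posterior of mu", xlab = "mu")
curve(dnorm(t, mean = xbar, sd = 1 / sqrt(n)), add = TRUE, xname = "t")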

c) Easy, omitted.

d) Because the posterior of $\mu$ is normal, we may immediately say that $\theta = e^{\mu}$ has a log-normal posterior, with the same parameters as the normal; that is, $\theta \mid x_1,\dots,x_n \sim \text{logNormal}(\bar{x},\, 1/n)$. Thus the posterior density for $\theta$ is given by:

$$f(\theta \mid x_1,\dots,x_n) = \frac{1}{\theta}\sqrt{\frac{n}{2\pi}}\; e^{-\frac{n}{2}(\log\theta - \bar{x})^2}, \qquad \theta > 0.$$
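To check the change of variables numerically, one can transform the posterior draws of $\mu$ and compare them with R's built-in log-normal density. A minimal sketch, reusing post, xbar, and n from the check in part b):

# theta = exp(mu); if mu | data ~ N(xbar, 1/n), then theta | data is
# log-normal with log-mean xbar and log-sd 1/sqrt(n)
theta <- exp(post)
hist(theta, breaks = 30, freq = FALSE, main = "Posterior of theta", xlab = "theta")
curve(dlnorm(t, meanlog = xbar, sdlog = 1 / sqrt(n)), add = TRUE, xname = "t")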

e) We consider a symmetric interval [−a, a] around zero for a standard normal variable Z:

$$P(-a \le Z \le a) = \Phi(a) - \Phi(-a) = 2\Phi(a) - 1.$$

Thus we obtain the 95 percent posterior interval by calculating a:

$$2\Phi(a) - 1 = 0.95 \;\Longrightarrow\; \Phi(a) = 0.975 \;\Longrightarrow\; a = 1.96.$$

Now we "un-normalize" the standard normal to match our posterior: since $\mu \mid x_1,\dots,x_n \sim N(\bar{x}, 1/n)$, the variable $Z = \sqrt{n}(\mu - \bar{x})$ is standard normal, so

$$0.95 = P\!\left(-1.96 \le \sqrt{n}(\mu - \bar{x}) \le 1.96\right) = P\!\left(\bar{x} - \tfrac{1.96}{\sqrt{n}} \le \mu \le \bar{x} + \tfrac{1.96}{\sqrt{n}}\right);$$

hence, solving for $\mu$ and plugging in the sample mean from your simulation, the 95 percent posterior interval is $\bar{x} \pm 1.96/\sqrt{n}$.
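Numerically, the interval only requires the standard normal quantile. A minimal R sketch, reusing xbar and n from above:

# 95% posterior interval for mu: xbar +/- 1.96 / sqrt(n)
z     <- qnorm(0.975)   # = 1.959964
ci_mu <- c(xbar - z / sqrt(n), xbar + z / sqrt(n))
ci_mu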

f) We can use the result from e), together with the fact that $e^{x}$ is strictly increasing, to state:

$$0.95 = P\!\left(\bar{x} - \tfrac{1.96}{\sqrt{n}} \le \mu \le \bar{x} + \tfrac{1.96}{\sqrt{n}}\right) = P\!\left(e^{\bar{x} - 1.96/\sqrt{n}} \le \theta \le e^{\bar{x} + 1.96/\sqrt{n}}\right).$$

So the sought interval is just $\left[\, e^{\bar{x} - 1.96/\sqrt{n}},\; e^{\bar{x} + 1.96/\sqrt{n}} \,\right]$.
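In R, since exp() is increasing, the interval for $\theta$ is obtained by exponentiating the endpoints of ci_mu from part e):

# 95% interval for theta = exp(mu)
ci_theta <- exp(ci_mu)
ci_theta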

Remark: this easy question puzzled the majority of you (and me too at the beginning). The struggle was to figure out the moments of the new variable $\theta = e^{\mu}$, which is log-normal. So some of you computed the mean and standard error of the log-normal and plugged them into the usual mean $\pm$ 1.96 s.e. formula. This is wrong, because the $\pm$1.96 multipliers refer to standard normal quantiles while $\theta$ is not normal, so the resulting interval does not have probability 0.95. It is not an unreasonable interval, and it is sensibly computed; it just does not have the 95% posterior probability you are required to achieve.

Some others used the delta method to figure out the above moments. This produces only an approximate interval, since the delta method is asymptotic, and so again it is not consistent with what you are required to do.

Problem 11.3

We begin by determining that the Uniform(0, $\theta$) distribution has density $1/\theta$ when $0 < x < \theta$ ...
