This exercise derives the normalization constant of the Beta(α, β) distribution in the case of integer parameters α = s + 1, β = n − s + 1, by exploring the connection between:

• Bayesian inference for a Bernoulli parameter with a uniform prior, and
• order statistics of uniform random variables.

Let p ∈ [0, 1] be the parameter of a Bernoulli(p) distribution (e.g. the probability of Heads for a coin). Suppose we have no prior information about p. In the Bayesian approach, we model our ignorance by treating the parameter p as a uniformly distributed random variable, p ~ Uniform([0, 1]). We can then model the observations X₁, X₂, …, Xₙ in the following way: let U₁, U₂, …, Uₙ be i.i.d. Uniform([0, 1]) random variables independent of p, and define

Xᵢ = 1{Uᵢ ≤ p} = 1 if Uᵢ ≤ p, 0 if Uᵢ > p.

(i) Reason that, conditioned on the value of p, the variables X₁, X₂, …, Xₙ are i.i.d. Bernoulli(p). Conclude that

P(X₁ = x₁, …, Xₙ = xₙ | p) = ∏ᵢ₌₁ⁿ p^xᵢ (1 − p)^(1 − xᵢ) = p^s (1 − p)^(n − s),

where s := x₁ + x₂ + … + xₙ.

(ii) Deduce that

P(X₁ + X₂ + … + Xₙ = s | p) = C(n, s) p^s (1 − p)^(n − s)

and

P(X₁ + X₂ + … + Xₙ = s) = ∫₀¹ C(n, s) p^s (1 − p)^(n − s) dp.

(Hint: for the second equation, use the Law of Total Probability.)
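The two-stage model in the exercise is easy to simulate. The sketch below (variable names are my own) draws p from the uniform prior, then the Uᵢ, and tallies the sum S = X₁ + ⋯ + Xₙ. Part (ii), combined with the Beta normalization the exercise is driving at, implies the unconditional P(S = s) is 1/(n + 1) for every s, and a seeded Monte Carlo run reflects that:

```python
import random
from collections import Counter

random.seed(0)
n, trials = 5, 200_000

counts = Counter()
for _ in range(trials):
    p = random.random()                              # p ~ Uniform([0, 1])
    s = sum(random.random() <= p for _ in range(n))  # X_i = 1{U_i <= p}
    counts[s] += 1

# Part (ii) + the Beta normalization give P(S = s) = 1/(n + 1)
# for every s in {0, ..., n}, i.e. S is uniform on {0, ..., n}.
for s in range(n + 1):
    print(s, counts[s] / trials)
```

Each empirical frequency comes out close to 1/6 ≈ 0.1667, even though conditionally on p the sum is binomial and far from uniform.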

by Ed King

1 Answer


Final answer:

The exercise establishes a connection between Bayesian inference for a Bernoulli parameter with a uniform prior and order statistics of uniform random variables. By treating the parameter p as a uniformly distributed random variable p ~ Uniform([0, 1]), and defining observations X₁, X₂, …, Xₙ from i.i.d. uniform random variables compared against p, one deduces that, conditioned on the value of p, X₁, X₂, …, Xₙ are independent and identically distributed (i.i.d.) Bernoulli(p). It follows that, given p, the sum X₁ + X₂ + ⋯ + Xₙ has a Binomial(n, p) distribution.

Step-by-step explanation:

This exercise delves into Bayesian inference concerning a Bernoulli parameter with a uniform prior and its relationship with order statistics of a uniform random variable. It introduces the concept of modeling ignorance about a parameter by treating it as a uniformly distributed random variable, p ~ Uniform([0, 1]).

This method defines the observations X₁, X₂, …, Xₙ from i.i.d. uniform random variables U₁, U₂, …, Uₙ that are independent of p: each Xᵢ equals 1 if Uᵢ ≤ p and 0 otherwise. Conditioned on p, the event {Uᵢ ≤ p} has probability exactly p, and the Xᵢ inherit independence from the Uᵢ, which establishes that X₁, X₂, …, Xₙ are i.i.d. Bernoulli(p) given p.
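The conditional claim can be checked directly: fix a value of p, generate the Xᵢ from uniforms as above, and compare the empirical frequency of 1s to p. A minimal sketch (the specific values p = 0.3 and n = 100 000 are chosen only for illustration):

```python
import random

random.seed(1)
p = 0.3        # condition on a fixed value of the parameter
n = 100_000

# X_i = 1 if U_i <= p else 0, with U_i i.i.d. Uniform([0, 1])
xs = [1 if random.random() <= p else 0 for _ in range(n)]

# Given p, each X_i is Bernoulli(p), so the empirical mean approaches p
print(sum(xs) / n)
```

The printed mean lands within a fraction of a percent of 0.3, as the law of large numbers predicts.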

The subsequent deductions concern the sum of the observations given a specific value of p. Since s successes can occur in C(n, s) equally likely arrangements, conditioning on p gives P(X₁ + X₂ + ⋯ + Xₙ = s | p) = C(n, s) p^s (1 − p)^(n − s), i.e. the conditional distribution of the sum is Binomial(n, p).
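This conditional probability is just the binomial pmf, which can be written out in a few lines (the helper name `binom_pmf` is my own):

```python
from math import comb

def binom_pmf(s, n, p):
    """P(X_1 + ... + X_n = s | p) = C(n, s) * p**s * (1 - p)**(n - s)."""
    return comb(n, s) * p**s * (1 - p)**(n - s)

# Sanity check: the conditional probabilities sum to 1 over s = 0, ..., n
n, p = 10, 0.3
total = sum(binom_pmf(s, n, p) for s in range(n + 1))
print(total)
```

The total is 1 up to floating-point error, confirming the conditional law is a proper distribution.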

Moreover, by the Law of Total Probability, integrating the conditional probability against the uniform prior removes the conditioning: P(X₁ + X₂ + ⋯ + Xₙ = s) = ∫₀¹ C(n, s) p^s (1 − p)^(n − s) dp. Evaluating this integral is exactly what yields the normalization constant of the Beta(s + 1, n − s + 1) density, forming the linkage between Bayesian inference and order statistics that the exercise is after.
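As a numerical check of that integral, a simple midpoint-rule quadrature (function names are my own; the known closed form is ∫₀¹ C(n, s) p^s (1 − p)^(n − s) dp = 1/(n + 1) for every s) can be run for all s at once:

```python
from math import comb

def integrand(p, n, s):
    # C(n, s) * p^s * (1 - p)^(n - s), the conditional probability from (ii)
    return comb(n, s) * p**s * (1 - p)**(n - s)

def midpoint_integral(f, a, b, steps=10_000):
    # plain midpoint rule; plenty accurate for these smooth polynomials
    h = (b - a) / steps
    return h * sum(f(a + (i + 0.5) * h) for i in range(steps))

n = 7
for s in range(n + 1):
    val = midpoint_integral(lambda p: integrand(p, n, s), 0.0, 1.0)
    print(s, round(val, 6))
```

Every value comes out as 1/(n + 1) = 0.125, independent of s, which is the Beta(s + 1, n − s + 1) normalization constant the exercise derives.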

by Remonia