3. (15 marks) Conditional on \(X\), \(Y\) is a \(\mathrm{Bern}(X)\) random variable. Marginally, \(X\) is a \(\mathrm{Beta}(\alpha,\beta)\) random variable, where both \(\alpha>0\) and \(\beta>0\) are constants. Note that \(E[Y\mid X]=X\) and \(\mathrm{Var}(Y\mid X)=X(1-X)\).

(a) Determine the marginal probability mass function (PMF) of \(Y\). [5 marks]

(b) Determine the conditional probability density function (PDF) of \(X\) given \(Y\). [5 marks]

(c) Conditional on \(X\), \((Y_1,\ldots,Y_n)\) are independent and identically distributed from a \(\mathrm{Bern}(X)\) distribution. Again, \(X\) is a \(\mathrm{Beta}(\alpha,\beta)\) random variable where both \(\alpha>0\) and \(\beta>0\) are constants. Determine the conditional probability density function (PDF) of \(X\) given \(Y_1,\ldots,Y_n\). [5 marks]


1 Answer


Final Answer:

(a) Marginal Probability Mass Function (PMF) of Y:


\[ P(Y=y) = \frac{B(\alpha+y,\,\beta+1-y)}{B(\alpha,\beta)} = \left(\frac{\alpha}{\alpha+\beta}\right)^{y}\left(\frac{\beta}{\alpha+\beta}\right)^{1-y}, \qquad y \in \{0,1\}, \]

that is, \(Y \sim \mathrm{Bern}\!\left(\frac{\alpha}{\alpha+\beta}\right)\).

(b) Conditional Probability Density Function (PDF) of X given Y:


\[ f_{X\mid Y}(x\mid y) = \frac{1}{B(\alpha+y,\,\beta+1-y)}\, x^{\alpha+y-1}(1-x)^{\beta-y}, \qquad 0<x<1, \]

that is, \(X \mid Y=y \sim \mathrm{Beta}(\alpha+y,\ \beta+1-y)\).

(c) Conditional Probability Density Function (PDF) of X given \(Y_1, \ldots, Y_n\):


\[ f_{X\mid Y_1,\ldots,Y_n}(x\mid y_1,\ldots,y_n) = \frac{1}{B\!\left(\alpha+n\bar{y},\,\beta+n(1-\bar{y})\right)}\, x^{\alpha+n\bar{y}-1}(1-x)^{\beta+n(1-\bar{y})-1}, \qquad 0<x<1, \]

where \(\bar{y}=\frac{1}{n}\sum_{i=1}^{n} y_i\); that is, \(X \mid Y_1,\ldots,Y_n \sim \mathrm{Beta}\!\left(\alpha+\sum_i y_i,\ \beta+n-\sum_i y_i\right)\).

Step-by-step explanation:

(a) Marginal Probability Mass Function (PMF) of Y:

The marginal PMF of \(Y\) is obtained by integrating the conditional distribution of \(Y\mid X\), which is Bernoulli with success probability \(X\), against the marginal distribution of \(X\), which is Beta with parameters \(\alpha\) and \(\beta\). Carrying out this integration gives

\[ P(Y=y) = \frac{B(\alpha+y,\,\beta+1-y)}{B(\alpha,\beta)}, \qquad y \in \{0,1\}. \]

Here \(B(a,b)=\frac{\Gamma(a)\Gamma(b)}{\Gamma(a+b)}\) is the Beta function and \(\Gamma\) is the gamma function. Using \(\Gamma(a+1)=a\,\Gamma(a)\), the ratio simplifies to \(P(Y=1)=\frac{\alpha}{\alpha+\beta}\) and \(P(Y=0)=\frac{\beta}{\alpha+\beta}\), so marginally \(Y \sim \mathrm{Bern}\!\left(\frac{\alpha}{\alpha+\beta}\right)\). The integration step is shown below.
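For completeness, writing the Bernoulli conditional PMF as \(x^{y}(1-x)^{1-y}\) and the Beta prior density explicitly, the integration step is:

\[
P(Y=y) = \int_0^1 x^{y}(1-x)^{1-y}\,\frac{x^{\alpha-1}(1-x)^{\beta-1}}{B(\alpha,\beta)}\,dx
= \frac{1}{B(\alpha,\beta)}\int_0^1 x^{\alpha+y-1}(1-x)^{\beta-y}\,dx
= \frac{B(\alpha+y,\,\beta+1-y)}{B(\alpha,\beta)},
\]

since the remaining integrand is the kernel of a \(\mathrm{Beta}(\alpha+y,\,\beta+1-y)\) density.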

(b) Conditional Probability Density Function (PDF) of X given Y:

To find the conditional PDF of \(X\) given \(Y\), we use Bayes' theorem: since \(Y\mid X \sim \mathrm{Bern}(X)\), the conditional density of \(X\) is the Bernoulli conditional PMF of \(Y\) given \(X\) multiplied by the Beta prior density of \(X\), divided by the marginal PMF of \(Y\) found in part (a). This yields

\[ f_{X\mid Y}(x\mid y) = \frac{1}{B(\alpha+y,\,\beta+1-y)}\, x^{\alpha+y-1}(1-x)^{\beta-y}, \qquad 0<x<1. \]

The normalizing constant is the Beta function \(B(\alpha+y,\,\beta+1-y)\), so \(X \mid Y=y \sim \mathrm{Beta}(\alpha+y,\ \beta+1-y)\): observing \(Y=1\) adds one to the first shape parameter, while observing \(Y=0\) adds one to the second. The Bayes' theorem computation is written out below.
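Written out in full, the Bayes' theorem step is:

\[
f_{X\mid Y}(x\mid y) = \frac{P(Y=y\mid X=x)\, f_X(x)}{P(Y=y)}
= \frac{x^{y}(1-x)^{1-y}\cdot \dfrac{x^{\alpha-1}(1-x)^{\beta-1}}{B(\alpha,\beta)}}{\dfrac{B(\alpha+y,\,\beta+1-y)}{B(\alpha,\beta)}}
= \frac{x^{\alpha+y-1}(1-x)^{\beta-y}}{B(\alpha+y,\,\beta+1-y)}.
\]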

(c) Conditional Probability Density Function (PDF) of X given \(Y_1, \ldots, Y_n\):

Extending the argument in part (b) to \(n\) independent and identically distributed Bernoulli observations, conditional on \(X=x\) the likelihood of \((y_1,\ldots,y_n)\) is \(x^{\sum_i y_i}(1-x)^{n-\sum_i y_i}\). Multiplying this likelihood by the Beta\((\alpha,\beta)\) prior density and normalizing updates the Beta parameters using the sample mean \(\bar{y}=\frac{1}{n}\sum_{i=1}^{n} y_i\) and the number of observations \(n\):

\[ f_{X\mid Y_1,\ldots,Y_n}(x\mid y_1,\ldots,y_n) = \frac{1}{B\!\left(\alpha+n\bar{y},\,\beta+n(1-\bar{y})\right)}\, x^{\alpha+n\bar{y}-1}(1-x)^{\beta+n(1-\bar{y})-1}, \qquad 0<x<1. \]

This expression accumulates the information from the observed Bernoulli variables: \(X \mid Y_1,\ldots,Y_n \sim \mathrm{Beta}\!\left(\alpha+\sum_i y_i,\ \beta+n-\sum_i y_i\right)\), so each observed success adds one to the first shape parameter and each observed failure adds one to the second, giving an updated distribution for \(X\) given the observed values of \(Y_1, \ldots, Y_n\). The kernel calculation is sketched below.
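Concretely, the proportionality calculation behind this update is:

\[
f_{X\mid Y_1,\ldots,Y_n}(x\mid y_1,\ldots,y_n)
\propto \left[\prod_{i=1}^{n} x^{y_i}(1-x)^{1-y_i}\right] x^{\alpha-1}(1-x)^{\beta-1}
= x^{\alpha+n\bar{y}-1}(1-x)^{\beta+n(1-\bar{y})-1},
\]

which is the kernel of a \(\mathrm{Beta}\!\left(\alpha+n\bar{y},\,\beta+n(1-\bar{y})\right)\) density, so the normalizing constant is \(B\!\left(\alpha+n\bar{y},\,\beta+n(1-\bar{y})\right)^{-1}\).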
