3 votes
Estimating the parameter of a geometric r.v.

We have k coins. The probability of Heads is the same for each coin and is the realized value q of a random variable Q that is uniformly distributed on [0,1]. We assume that conditioned on Q=q, all coin tosses are independent. Let Ti be the number of tosses of the ith coin until that coin results in Heads for the first time, for i=1,2,…,k. (Ti includes the toss that results in the first Heads.)

You may find the following integral useful: For any non-negative integers k and m,

∫₀¹ q^k (1−q)^m dq = k! m! / (k+m+1)!.

Find the PMF of T1. (Express your answer in terms of t using standard notation.)

For t=1,2,…,

pT1(t)=

asked by Huu (8.1k points)

1 Answer

4 votes

Final answer:

The PMF of T1, the number of tosses of the first coin until it shows Heads when the bias is uniformly distributed on [0,1], is pT1(t) = 1/(t(t+1)) for t = 1, 2, …. The derivation conditions on Q = q, under which the tosses are independent and T1 is geometric with parameter q, and then averages over the uniform prior on q.

Step-by-step explanation:

Calculating the PMF of T1 for a Geometric Random Variable with a Uniformly Distributed Probability

Given a geometric random variable T1, which represents the number of coin tosses until a head appears for the first time, we aim to find the probability mass function (PMF) of T1. Each coin has a probability of Heads represented by a random variable Q uniformly distributed over the interval [0,1]. When conditioned on Q=q, the coin tosses are independent.

The integral provided is the Beta-function identity, which is exactly what we need to average over a uniformly distributed parameter. By the total probability theorem, the PMF of T1 is the conditional (geometric) PMF P(T1 = t | Q = q) = q(1−q)^(t−1), i.e. t−1 Tails followed by one Heads, averaged over the uniform density of Q. Specifically, the desired PMF of T1 at t is:

pT1(t) = ∫₀¹ q (1−q)^(t−1) dq, for t = 1, 2, …

Applying the given integral with k = 1 and m = t − 1:

pT1(t) = 1!·(t−1)! / (1 + (t−1) + 1)! = (t−1)!/(t+1)! = 1/(t(t+1))

Therefore, pT1(t) = 1/(t(t+1)) for t = 1, 2, …. As a sanity check, Σₜ 1/(t(t+1)) = Σₜ (1/t − 1/(t+1)) telescopes to 1, so this is a valid PMF.
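To double-check the factorial algebra, one can evaluate the given Beta-integral identity with exact rational arithmetic in Python (the function name here is illustrative, not from the problem):

```python
from fractions import Fraction
from math import factorial

def beta_integral(k, m):
    # Given identity: ∫₀¹ q^k (1-q)^m dq = k! m! / (k+m+1)!
    return Fraction(factorial(k) * factorial(m), factorial(k + m + 1))

# With k = 1 and m = t - 1 this should equal 1/(t(t+1)) for every t.
for t in range(1, 8):
    assert beta_integral(1, t - 1) == Fraction(1, t * (t + 1))
print("pT1(t) = 1/(t(t+1)) verified exactly for t = 1..7")
```

Using `Fraction` avoids any floating-point rounding, so the check confirms the simplification (t−1)!/(t+1)! = 1/(t(t+1)) exactly.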

The geometric distribution is central here because it describes the number of trials until the first success given Q = q, and we assume independence of the tosses conditioned on Q = q. The uniform distribution of Q means every bias value is equally likely a priori, and averaging the geometric PMF over that prior produces the heavier-tailed 1/(t(t+1)) law.
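As an empirical sanity check (a sketch, not part of the original solution), a short Monte Carlo simulation can draw Q uniformly, draw T1 geometrically with that bias via inverse-CDF sampling, and compare the empirical frequencies with 1/(t(t+1)):

```python
import math
import random

rng = random.Random(0)  # fixed seed for reproducibility

def sample_T1(rng):
    """Draw Q ~ Uniform[0,1], then T1 ~ Geometric(Q) by inverse-CDF sampling."""
    q = rng.random()
    while q == 0.0:          # guard: a zero bias would never produce Heads
        q = rng.random()
    u = rng.random()         # U in [0, 1)
    # Smallest t with 1 - (1-q)^t > U, i.e. the first-Heads toss index.
    return math.floor(math.log(1.0 - u) / math.log(1.0 - q)) + 1

N = 200_000
counts = {}
for _ in range(N):
    t = sample_T1(rng)
    counts[t] = counts.get(t, 0) + 1

for t in range(1, 6):
    empirical = counts.get(t, 0) / N
    exact = 1 / (t * (t + 1))
    print(f"t={t}: empirical {empirical:.4f}, exact {exact:.4f}")
```

The empirical frequencies should land close to 1/2, 1/6, 1/12, … for t = 1, 2, 3, …. Inverse-CDF sampling is used instead of simulating individual tosses because E[T1] = Σ 1/(t+1) diverges, so a naive toss loop can occasionally run very long when q is tiny.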

answered by Brubs (8.0k points)