Show that if X is a geometric random variable with parameter p, then

E[1/X] = -\frac{p \ln p}{1-p}

Hint: You will need to evaluate an expression of the form

\sum_{i=1}^{\infty} \frac{a^i}{i}

To do so, write

\frac{a^i}{i} = \int_0^a x^{i-1}\,dx

and then interchange the sum and the integral.

1 Answer


Answer:


\sum_{k=1}^{\infty} \frac{p(1-p)^{k-1}}{k} = -\frac{p \ln p}{1-p}

Explanation:

The geometric distribution models the number of trials needed to obtain the first success in a series of independent Bernoulli trials. This discrete distribution is given by the probability mass function:


P(X=x) = (1-p)^{x-1} p, \qquad x = 1, 2, 3, \ldots

Let X be the random variable that counts the number of trials until the first success; then X follows this distribution:

X \sim \mathrm{Geo}(p)

In order to find the expected value E(1/X) we need to evaluate this sum:

E(1/X) = \sum_{k=1}^{\infty} \frac{1}{k}\, p(1-p)^{k-1} = \sum_{k=1}^{\infty} \frac{p(1-p)^{k-1}}{k}

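As an illustrative sanity check (not part of the original answer), the sum above can be estimated two independent ways in Python: by Monte Carlo simulation with NumPy's geometric sampler, which also counts trials until the first success, and by a direct partial sum of the series. The value p = 0.3 is an arbitrary choice for the check.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3

# Monte Carlo estimate of E[1/X]: sample X ~ Geo(p) (support 1, 2, ...)
# and average 1/X over many draws.
samples = rng.geometric(p, size=1_000_000)
mc_estimate = np.mean(1.0 / samples)

# Direct partial sum of  sum_{k>=1} p(1-p)^(k-1)/k
k = np.arange(1, 10_000)
partial_sum = np.sum(p * (1 - p) ** (k - 1) / k)

print(mc_estimate, partial_sum)  # both close to -p*ln(p)/(1-p) ≈ 0.516
```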
Let's consider the geometric series

\sum_{k=1}^{\infty} r^{k-1} = \frac{1}{1-r}, \qquad 0 \le r < 1

This is a power series in r, so for a number b in (0, 1) we can integrate it over [0, b] and interchange the sum and the integral:

\int_0^b \sum_{k=1}^{\infty} r^{k-1}\,dr = \sum_{k=1}^{\infty} \int_0^b r^{k-1}\,dr = \sum_{k=1}^{\infty} \frac{b^k}{k} \qquad (a)

The integrand on the left side of equation (a) is \frac{1}{1-r}, so that integral evaluates to:

\int_0^b \frac{1}{1-r}\,dr = -\ln(1-b)

Equating the two sides and dividing by b, we have:

\sum_{k=1}^{\infty} \frac{b^{k-1}}{k} = \frac{1}{b}\sum_{k=1}^{\infty} \frac{b^k}{k} = -\frac{\ln(1-b)}{b}

And with this we have the required identity.
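The identity above can be spot-checked numerically; this small Python snippet (an illustration, with b = 0.7 chosen arbitrarily) compares a partial sum of the series with the closed form.

```python
import math

# Spot-check:  sum_{k>=1} b^(k-1)/k  should equal  -ln(1-b)/b  for b in (0, 1).
b = 0.7
partial = sum(b ** (k - 1) / k for k in range(1, 500))  # 500 terms is plenty
closed = -math.log(1 - b) / b

print(partial, closed)  # both ≈ 1.7200
```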

And since b = 1-p lies in (0, 1), substituting b = 1-p gives:

\sum_{k=1}^{\infty} \frac{p(1-p)^{k-1}}{k} = p \cdot \left(-\frac{\ln(1-(1-p))}{1-p}\right) = -\frac{p \ln p}{1-p}

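The final closed form can be verified against the series definition for several values of p; this is an illustrative check, with the values of p chosen arbitrarily.

```python
import math

# Verify  E[1/X] = -p ln(p) / (1 - p)  against the defining series
# sum_{k>=1} p(1-p)^(k-1)/k  for a few success probabilities.
for p in (0.1, 0.5, 0.9):
    series = sum(p * (1 - p) ** (k - 1) / k for k in range(1, 2000))
    closed = -p * math.log(p) / (1 - p)
    assert abs(series - closed) < 1e-9, p
```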
Answered by DSC