The video link doesn't work for me, but I assume this is the problem of choosing between two payment types: one immediately grants you a lump sum of cash, while the other pays $0.01 on the first day, $0.02 on the second, $0.04 on the third, and so on, doubling each day.
Let $a_n$ be the amount of money earned on the $n$-th day. Then $(a_n)$ is a geometric sequence that satisfies
$$a_{n+1} = 2a_n.$$
We can solve explicitly for $a_n$ in terms of the starting payment $a_1$:
$$a_n = 2a_{n-1} = 2^2 a_{n-2} = \cdots = 2^{n-1} a_1.$$
So the amount of money earned on the $n$-th day is
$$a_n = 2^{n-1} a_1.$$
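As a quick sanity check, here is a small Python sketch of the daily payment (working in cents to avoid floating-point issues; the function name is mine, not from the problem):

```python
def daily_payment_cents(n: int, a1_cents: int = 1) -> int:
    """Payment on the n-th day, in cents, when the payment doubles each day."""
    return a1_cents * 2 ** (n - 1)

# Days 1-4 pay 1, 2, 4, 8 cents, matching $0.01, $0.02, $0.04, $0.08.
print([daily_payment_cents(n) for n in (1, 2, 3, 4)])  # [1, 2, 4, 8]
```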
Denote the total amount of money earned over $n$ days by $S_n$. Then we can write
$$S_n = a_1 + a_2 + \cdots + a_n,$$
but since we have equivalent expressions for $a_k$ on any given day $k$, this is the same as
$$S_n = a_1\left(1 + 2 + 2^2 + \cdots + 2^{n-1}\right).$$
Let's call the sum on the right-hand side $G_n = 1 + 2 + 2^2 + \cdots + 2^{n-1}$. Notice that
$$2G_n - G_n = \left(2 + 2^2 + \cdots + 2^n\right) - \left(1 + 2 + \cdots + 2^{n-1}\right) = 2^n - 1,$$
which means we end up with
$$S_n = a_1\left(2^n - 1\right).$$
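The closed form can be checked against a brute-force sum in Python (a sketch with names of my own choosing, again working in cents so the arithmetic is exact):

```python
def total_brute_cents(n: int, a1: int = 1) -> int:
    # Add up the daily payments one by one: a1 * 2^(k-1) on day k.
    return sum(a1 * 2 ** (k - 1) for k in range(1, n + 1))

def total_closed_cents(n: int, a1: int = 1) -> int:
    # Closed form: S_n = a1 * (2^n - 1).
    return a1 * (2 ** n - 1)

# The two agree for every n we try.
assert all(total_brute_cents(n) == total_closed_cents(n) for n in range(1, 61))
```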
To answer part (D), you need look no further than the formula for $S_n$. The question is basically asking what happens as $n$ gets arbitrarily large. It should be clear that $2^n - 1$ grows without bound, so as $n \to \infty$, the amount of money you would get diverges to infinity. So technically, you cannot calculate the amount, because (1) there's only a finite amount of money to go around, and (2) infinity is not a "computable" number.
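To get a feel for how fast this diverges, here is an illustrative snippet tabulating the total (in dollars, starting from one cent):

```python
def total_dollars(n: int) -> float:
    # S_n = a1 * (2^n - 1) with a1 = $0.01, i.e. one cent.
    return (2 ** n - 1) / 100

for n in (30, 60, 90):
    print(n, f"${total_dollars(n):,.2f}")
# By day 90 the total is on the order of 10^25 dollars, far more
# money than actually exists.
```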
If, however, the scale factor applied to your income were smaller than $1$ (say you started with a million dollars on the first day, and your income were halved each day), then $a_n$ would eventually converge to $0$, and on top of that, $S_n$ would converge to a finite number.
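For contrast, the halved-income scenario can be sketched numerically; by the standard geometric-series limit, the partial sums level off at $a_1/(1-r) = \$2{,}000{,}000$:

```python
a1, r = 1_000_000.0, 0.5  # $1M on day one, income halved each day

def total(n: int) -> float:
    # Partial sum of a geometric series: S_n = a1 * (1 - r^n) / (1 - r).
    return a1 * (1 - r ** n) / (1 - r)

for n in (10, 30, 60):
    print(n, total(n))  # the values approach a1 / (1 - r) = 2,000,000
```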