Show that if lim_{x→∞} f(x) = 1, then ∫_a^∞ f(x) dx diverges.

Given: lim_{x→∞} f(x) = 1. Let f(x) = 1/x + 1, which satisfies f → 1.

∫_a^∞ f(x) dx = ∫_a^∞ (1/x + 1) dx = [ln x + x]_a^∞ = ∞

Hence ∫_a^∞ f(x) dx diverges.
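A minimal SymPy sketch, assuming SymPy is available and taking the lower limit a as a positive symbol, confirms this computation:

```python
import sympy as sp

x, a = sp.symbols('x a', positive=True)

f = 1/x + 1  # example function with lim_{x -> oo} f(x) = 1

# Limit at infinity is 1, as assumed.
print(sp.limit(f, x, sp.oo))           # 1

# The improper integral from a to infinity evaluates to oo,
# matching the hand computation [ln x + x]_a^oo = oo.
print(sp.integrate(f, (x, a, sp.oo)))  # oo
```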

1 Answer


Final answer:

To show that if lim_{x→∞} f(x) = 1, then ∫_a^∞ f(x) dx diverges, you can use the limit comparison test, comparing f with the constant function g(x) = 1.

Step-by-step explanation:

To show that if lim_{x→∞} f(x) = 1, then ∫_a^∞ f(x) dx diverges, you can use the limit comparison test with the constant function g(x) = 1. Since f(x) → 1, f(x) is eventually positive, so the test applies. Take the limit of the quotient f(x)/g(x) as x → ∞: if it is a positive finite number, then ∫_a^∞ f(x) dx and ∫_a^∞ g(x) dx either both converge or both diverge. Here lim_{x→∞} f(x)/g(x) = lim_{x→∞} f(x) = 1, and ∫_a^∞ 1 dx = [x]_a^∞ = ∞ diverges, so ∫_a^∞ f(x) dx diverges as well. For the specific example f(x) = 1/x + 1, the integral can also be computed directly: ∫_a^∞ (1/x + 1) dx = [ln x + x]_a^∞ = ∞.
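To make the divergence step fully explicit, the bound behind the comparison can be written out; taking ε = 1/2 in the limit definition is one standard choice (a sketch, not the only possible bound). Since lim_{x→∞} f(x) = 1, there is an M ≥ a with f(x) > 1/2 for all x > M, so

```latex
\[
\int_M^{t} f(x)\,dx \;\ge\; \int_M^{t} \tfrac{1}{2}\,dx
  \;=\; \tfrac{1}{2}\,(t - M) \;\longrightarrow\; \infty
  \quad \text{as } t \to \infty,
\]
```

and therefore the tail ∫_M^∞ f(x) dx, and with it ∫_a^∞ f(x) dx, diverges.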
