True or False: An infinite geometric series diverges if the absolute value of the common ratio (|r|) is greater than 1.


1 Answer


Final answer:

The statement is true: an infinite geometric series diverges if |r| is greater than 1, because the terms grow in magnitude and the partial sums never settle on a finite value.

Step-by-step explanation:

The statement is true: an infinite geometric series diverges if the absolute value of the common ratio (|r|) is greater than 1. If |r| > 1, each term is larger in absolute value than the one before it, because at every step you multiply by a factor whose magnitude exceeds 1, so the partial sums cannot approach a finite limit.
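As a quick check, take a = 1 and r = 2 (values chosen purely for illustration). The partial sums are

S_n = 1 + 2 + 4 + ... + 2^(n-1) = 2^n - 1,

which grow without bound, so the series has no finite sum and diverges.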

For a geometric series to converge, the common ratio must satisfy the condition |r| < 1; only then do the terms shrink toward zero quickly enough for the partial sums to approach a finite value.
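For comparison, when |r| < 1 the standard formula for the sum of an infinite geometric series with first term a applies:

S = a / (1 - r).

For example, with a = 1 and r = 1/2 (again, illustrative values), the series 1 + 1/2 + 1/4 + ... converges to S = 1 / (1 - 1/2) = 2.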
