Final answer:
The statement is true: an infinite geometric series diverges if |r| > 1, because the terms grow in size rather than shrinking, so the sum can never settle on a finite value.
Step-by-step explanation:
The statement is true: an infinite geometric series diverges if the absolute value of the common ratio, |r|, is greater than 1. When |r| > 1, each term is obtained by multiplying the previous one by a factor whose magnitude exceeds 1, so the terms grow in absolute value at every step instead of approaching zero, and the partial sums cannot approach a finite limit.
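A quick check with the standard partial-sum formula makes this concrete (here a denotes the first term and S_n the n-th partial sum; this notation is added for illustration and is not in the original answer):

\[
S_n = a + ar + ar^2 + \cdots + ar^{n-1} = a\,\frac{1 - r^n}{1 - r}, \qquad r \neq 1.
\]

With a = 1 and r = 2 (so |r| > 1), the powers r^n grow without bound:

\[
S_1 = 1,\quad S_2 = 3,\quad S_3 = 7,\quad S_4 = 15,\quad \dots,\quad S_n = 2^n - 1 \longrightarrow \infty,
\]

so the partial sums increase without bound and the series diverges.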
For a geometric series to converge, the common ratio must satisfy |r| < 1; only then do the terms shrink toward zero quickly enough for the partial sums to approach a finite value.
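Using the same partial-sum formula (same assumed notation as above), when |r| < 1 the term r^n tends to 0, so

\[
S_n = a\,\frac{1 - r^n}{1 - r} \longrightarrow \frac{a}{1 - r} \quad \text{as } n \to \infty.
\]

For example, with a = 1 and r = 1/2:

\[
1 + \tfrac{1}{2} + \tfrac{1}{4} + \tfrac{1}{8} + \cdots = \frac{1}{1 - \tfrac{1}{2}} = 2,
\]

a finite sum, in contrast to the divergent case |r| > 1.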