To show that g(x) grows slower than f(x), what is the result of \displaystyle \lim_{x \to \infty} \frac{g(x)}{f(x)}?

1 Answer


There are several assumptions that we must make in order to answer this question.

First of all, we must assume that both functions grow to infinity. Indeed, if f(x) \to l_f and g(x) \to l_g with l_f, l_g < \infty (and l_f \neq 0), you simply have

\displaystyle \lim_{x \to \infty} \frac{g(x)}{f(x)} = \frac{l_g}{l_f}
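
For instance, taking g(x) = 2 + \frac{1}{x} and f(x) = 3 - \frac{1}{x} gives l_g = 2 and l_f = 3, so the quotient tends to \frac{2}{3} rather than 0:

\displaystyle \lim_{x \to \infty} \frac{g(x)}{f(x)} = \frac{2}{3}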

Secondly, we must assume that g(x) grows asymptotically slower than f(x). Otherwise, you might choose

\displaystyle g(x) = x, \quad f(x) = x + 1 \implies g(x) < f(x) \ \forall x

but you would have

\displaystyle \lim_{x \to \infty} \frac{g(x)}{f(x)} = 1
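
Indeed, dividing numerator and denominator by x,

\displaystyle \lim_{x \to \infty} \frac{x}{x + 1} = \lim_{x \to \infty} \frac{1}{1 + \frac{1}{x}} = 1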

If instead g(x) is asymptotically slower than f(x), then by the definition of being asymptotically slower you have

\displaystyle \lim_{x \to \infty} \frac{g(x)}{f(x)} = 0
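
For example, g(x) = x grows asymptotically slower than f(x) = x^2, and indeed

\displaystyle \lim_{x \to \infty} \frac{x}{x^2} = \lim_{x \to \infty} \frac{1}{x} = 0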

answered by Madu