Final answer:
To check whether f(n) = o(g(n)), evaluate the limit of f(n)/g(n) as n approaches infinity. If the limit is 0, then f(n) = o(g(n)); if the limit is nonzero or does not exist, then f(n) is not o(g(n)).
Step-by-step explanation:
The notation f(n) = o(g(n)) means that f(n) grows strictly slower than g(n), i.e., f(n) becomes negligible compared with g(n) as n approaches infinity. To check whether f(n) = o(g(n)), we examine the limit of f(n)/g(n) as n approaches infinity: if the limit is 0, then f(n) = o(g(n)); if the limit is nonzero or does not exist, then f(n) is not o(g(n)).
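For reference, this condition can be written out formally; the following is a standard statement of little-o, and the limit form assumes g(n) > 0 for all sufficiently large n:

% Little-o: f is eventually below every positive constant multiple of g
% (the limit form assumes g(n) > 0 for all sufficiently large n).
f(n) = o(g(n))
  \;\Longleftrightarrow\; \forall c > 0 \ \exists n_0 \ \forall n \ge n_0 :\ |f(n)| \le c \, g(n)
  \;\Longleftrightarrow\; \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0.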
For example, let f(n) = 2n+1 and g(n) = n^2. To find the limit of f(n)/g(n) as n approaches infinity, we examine the ratio (2n+1) / n^2.
Dividing each term of the numerator by n^2 gives 2/n + 1/n^2, and both terms tend to 0 as n approaches infinity (the numerator grows only linearly while the denominator grows quadratically). Therefore the limit of the ratio is 0, and f(n) = o(g(n)).
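If you want to sanity-check a limit like this one, a minimal sketch using sympy works (this assumes sympy is installed; it is not part of the original answer):

import sympy as sp

n = sp.symbols('n', positive=True)
f = 2*n + 1          # f(n) = 2n + 1
g = n**2             # g(n) = n^2

# Little-o test: the limit of f(n)/g(n) as n -> infinity should be 0
print(sp.limit(f / g, n, sp.oo))   # prints 0, so f(n) = o(g(n))

The same check applies to any pair of functions: compute the limit of the ratio, and a result of 0 confirms the little-o relationship.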