Answer:
See explanation.
Step-by-step explanation:
Whenever we want to analyze the time complexity of an algorithm on an input of size n, we denote its running time by T(n), and we then study how fast T(n) grows as n becomes large.
To do this, we first obtain T(n), the total execution time of the algorithm expressed in terms of n.
We then write this function using a growth-rate notation, comparing it against a simpler bounding function f(n).
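For instance, the small Python sketch below (a hypothetical find-the-maximum scan introduced here for illustration; it is not part of the original question) counts its basic operations so that the running time can be written down as a function of n. It also motivates the function T(n) = n - 1 used in the example later on.

# A minimal sketch (assumed example): count the comparisons made by
# a simple "find the maximum" scan so that T(n) can be expressed in terms of n.

def find_max_with_count(values):
    """Return (maximum, number_of_comparisons) for a non-empty list."""
    comparisons = 0
    best = values[0]
    for v in values[1:]:        # the loop body runs n - 1 times
        comparisons += 1        # one comparison per remaining element
        if v > best:
            best = v
    return best, comparisons

# For an input of size n, the scan performs n - 1 comparisons,
# so its running time can be modeled as T(n) = n - 1.
_, t = find_max_with_count(list(range(10)))   # n = 10
print(t)                                       # prints 9, i.e. n - 1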
There are three basic asymptotic notations:
1. Big oh (O) notation
2. Theta (Θ) notation
3. Big omega (Ω) notation
Here we explain big oh (O) notation.
Here is the formal mathematical definition of Big O, also called the asymptotic upper bound.
Let T(n) and f(n) be two positive functions. We write T(n) ∊ O(f(n)) if there are positive constants c and n₀ such that
T(n) ≤ c·f(n) for all n ≥ n₀.
The attached graph shows a situation where all of the conditions in the definition hold: the vertical axis is cost, the curves plotted are c·f(n) and T(n), and T(n) stays below c·f(n) for every n ≥ n₀.
We can say that T(n) ∊ O(f(n)) means the growth rate of T(n) is not faster than that of f(n).
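As a rough numerical illustration of the definition (a sketch only, not a proof), the helper below checks the inequality T(n) ≤ c·f(n) over a range of sample values n ≥ n₀ for a proposed pair of constants. The names holds_for_samples, T, f, c, n0, and max_n are introduced here for demonstration; they do not come from the original answer.

# Sanity check of the Big O definition on sampled values of n (not a proof):
# given candidate constants c and n0, verify T(n) <= c * f(n) for n0 <= n <= max_n.

def holds_for_samples(T, f, c, n0, max_n=10_000):
    """Return True if T(n) <= c * f(n) for every sampled n in [n0, max_n]."""
    return all(T(n) <= c * f(n) for n in range(n0, max_n + 1))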
Example:
T(n) = n - 1. Using Big O notation, this can be written as T(n) ∊ O(n).
Let c = 1 and n₀ = 1;
then T(n) = n - 1 ≤ 1·n for all n ≥ 1.
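Using the hypothetical checker sketched above, the same witness c = 1, n₀ = 1 can also be sanity-checked numerically on sampled values of n:

# T(n) = n - 1 and f(n) = n, with the witness c = 1, n0 = 1 from the example.
print(holds_for_samples(T=lambda n: n - 1,
                        f=lambda n: n,
                        c=1, n0=1))   # True: n - 1 <= 1 * n for every sampled n >= 1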