## Asymptotic Notations – I

In this post, I will discuss time functions in asymptotic notations. You have probably heard of asymptotic notations: the Big-Oh, Big-Omega, and Theta functions. I had read about them in detail during my B. Tech [I can't recollect which semester, and that is a pity].

It is going to be a [little] in-depth analysis, so if you are not interested, or you already have a fair idea of what these functions are, you can skip this post.

When we talk about the time complexity of an algorithm, of course, we do not find the exact time the algorithm is going to take to complete its execution. Mostly because that depends on a lot of other factors, for example, the capacity of the environment in which the code runs. What we are interested in is the growth rate of the execution time.

We study the order of growth of algorithms with large input sizes, i.e. the behaviour of an algorithm's running time as the input size increases. We represent this using asymptotic notations.

In an ideal scenario, we should prefer algorithms that are asymptotically more efficient.

There are three functions:

1. Big-Oh (written as capital O)
2. Big-Omega (written as Ω)
3. Theta (written as Θ)

In simple words,
big-oh represents the upper bound,
big-omega represents the lower bound, and
theta represents the tight bound (both an upper and a lower bound).

To understand these functions and their meaning, we’ll take an example and walk through it.

Problem statement:
In a sorted array, whose first element is 0 and the last element is 99, find the index of a random input number.

[Of course, the index will be the same as the number, but just for the illustration purposes, let’s write the algorithm for this problem.]

```
Arr = [0, 1, 2, 3, ....., 98, 99]

Algorithm FindIndex (Arr, Inp)
begin
  for (index = 0; index < Arr.length; index++) {
    if (Arr[index] == Inp) {
      print index;             // match found
      break;
    }
  }
end
```

Let's quickly calculate the exact time function [to find out how, please check this post].

Roughly, it is going to be `f (n) = 2n + 3`.
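For a quick sanity check, here is the same linear search written out in Python (a direct translation of the pseudocode above; the function name `find_index` is my own):

```python
def find_index(arr, inp):
    """Linear search: print and return the index of inp in arr, or None."""
    for index in range(len(arr)):
        if arr[index] == inp:
            print(index)  # match found
            return index
    return None

arr = list(range(100))  # [0, 1, 2, ..., 99]
find_index(arr, 42)     # prints 42, since arr[i] == i in this array
```

The loop body runs at most `n` times, which is where the `2n` part of `f (n) = 2n + 3` comes from; the remaining `+3` accounts for the constant-time work outside the loop.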

Let us now jump back to the asymptotic functions, and the first one is:

### Big-Oh – The Upper Bound

A function f (n) belongs to Big-Oh of g (n) if there exist positive constants `c` and `n0` such that for all values of `n` greater than or equal to `n0`, the value of f (n) is greater than or equal to 0 and at most `c` times g (n), i.e. 0 <= f (n) <= c.g (n).

Let's work this out with the equation we derived for the above algorithm.

```
f (n) = 2n + 3

Substituting f (n) into the asymptotic inequality,

0 <= 2n + 3 <= c.g (n)

The LHS is clearly satisfied even when the value of n is 0.

i.e. for n = 0,
0 <= 2 x 0 + 3
0 <= 3 ............ true

Let's take the RHS now.

2n + 3 <= c.g (n)

In the first shot, let's put a random value for c and g (n),
c = 10
g (n) = n

So, the inequality becomes,

2n + 3 <= 10n

We can clearly see that for n = 1, 2, 3, .... this inequality holds
(it fails only at n = 0, so n0 = 1).

At this point, using the definition given above, we can say,

f (n) belongs to Big-Oh of g (n)
f (n) = O (n)           // as g (n) is n, with c = 10 and n0 = 1
```
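This claim is easy to verify numerically. A short Python check, using the bound c = 10, g (n) = n from the working above (checking a finite range of n, which illustrates but of course does not prove the bound):

```python
def f(n):
    return 2 * n + 3  # the exact time function derived above

c = 10
# 0 <= f(n) <= c * n should hold for every n >= n0 = 1
assert all(0 <= f(n) <= c * n for n in range(1, 1001))
assert not (f(0) <= c * 0)  # n = 0 violates the upper bound, hence n0 = 1
print("2n + 3 is O(n) with c = 10, n0 = 1")
```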

Bonus
In case you are wondering how to find a small valid value of `c` [not necessarily the minimum], do the following,

```
in the inequality,

2n + 3 <= c.g (n)

replace c.g (n) with the same terms that are present on the LHS, except,
this time, multiply each term by n (if it is not multiplied already).

i.e.
2n + 3 <= 2n + 3n
2n + 3 <= 5n

for n = 0,
3 <= 0       .............. false

for n = 1,
5 <= 5       .............. true

for n = 2,
7 <= 10      .............. true

So,

c is 5,
g (n) is n, and
n0 is 1

Putting all of these in the definition, we can say,

f (n) is O (n) for c and n0 with values 5 and 1 respectively.
```
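Again, a quick Python check confirms this tighter constant (c = 5 and n0 = 1 come from the working above; as before, a finite range of n only illustrates the bound):

```python
def f(n):
    return 2 * n + 3

c, n0 = 5, 1
assert not (f(0) <= c * 0)                           # fails at n = 0
assert all(f(n) <= c * n for n in range(n0, 1001))   # holds for n >= n0
print("f(n) <= 5n for all n >= 1")
```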

In the next post, I will discuss the remaining two functions using the same example we discussed today.