## Time & Space Complexity – III

In the previous post, I talked about the time and space complexity of simple for loops. I hope you worked out the time and space complexity of the algorithm with three nested for loops as O(n^3) and O(n^2) respectively.

Let’s discuss some more examples: algorithms whose for loops have different conditions. Since we already know how to find the space complexity of for loops, I will not discuss it in this post; here we are not worried about how much space our algorithm occupies. Instead, we will tweak the conditions in the for loop, which makes it a little harder to predict the number of times the statement inside the loop will execute.

Let’s start with a simple one though.

```
// Algorithm 1
1. Algorithm ForLoop (n)
2. begin
3.   for (i = 0; i < n; i = i + 2) {
4.     // statement
5.   }
6. end
```

In all previous examples, we concluded that the statement inside the loop executes N times. From now on, we will also discard the (N+1) executions of the for loop’s condition check and focus only on the loop body.

But in this case, we can see that the statement at line 4 will not run n times, because the index is incremented by 2 in each iteration; it runs only n/2 times.

**Note:** *Similarly, the increment for the index can be 20, 30, or any other constant value, but the time-complexity calculation remains the same for all of them because, eventually, we discard constant factors.*

The point is, the time complexity of all such algorithms is linear: a function of degree 1. Therefore, as we saw in the first example of part 1 of this series, the time complexity of such algorithms is of order n.
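To see this concretely, here is a small Python sketch (my addition, not part of the original pseudocode) that counts how many times the loop body of Algorithm 1 runs for different constant step sizes:

```python
def count_iterations(n, step):
    """Count how many times the body runs for
    `for (i = 0; i < n; i = i + step)`."""
    count = 0
    i = 0
    while i < n:
        count += 1   # the loop body ("// statement") runs here
        i += step
    return count

# Doubling n doubles the count, whatever the constant step is.
print(count_iterations(100, 2))    # 50
print(count_iterations(200, 2))    # 100
print(count_iterations(1000, 20))  # 50
print(count_iterations(2000, 20))  # 100
```

The step size only changes the constant factor (n/2, n/20, …), and constants are discarded, so the growth is linear in n either way.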

Let’s consider one more example.

```
// Algorithm 2
1. Algorithm ForLoop (N)
2. begin
3.   p = 0
4.   for (i = 1; p <= N; i++) {
5.     p = p + i
6.   }
7. end
```

This one doesn’t look as simple, because the break condition of this loop depends on the value of ‘p’, which is not a constant. Let us trace how ‘p’ grows with ‘i’:

```
when the value of i is    the value of p will be
=====================     ======================
1                         1
2                         1 + 2
3                         1 + 2 + 3
.                         .
.                         .
k                         1 + 2 + 3 + ... + k
```

We can see that this code will not run N times, because of its break condition. We are adding ‘i’ to ‘p’ in every iteration, so the loop keeps running until ‘p’ exceeds N. Let us call the value of ‘i’ at that point ‘k’.
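As a quick check of the table above, here is a short Python sketch (my addition, mirroring Algorithm 2) that records the value of ‘p’ after each iteration:

```python
def trace_p(N):
    """Mirror Algorithm 2, recording p after each iteration."""
    p = 0
    i = 1
    values = []
    while p <= N:
        p = p + i        # line 5 of Algorithm 2
        values.append(p)
        i += 1
    return values

# p walks through the triangular numbers 1, 3, 6, 10, ...,
# i.e. after k iterations p = k(k+1)/2.
print(trace_p(20))  # [1, 3, 6, 10, 15, 21]
```

The last printed value is the first one to exceed N, which is exactly when the loop breaks.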

```
So, when the loop exits, the value of i will be k,
the body (line 5) will have run k times, and
p = 1 + 2 + 3 + .... + k.
It'd be safe to say,
p > N                 .. (1)  // that's why the loop breaks
Also, from the table above,
p = 1 + 2 + 3 + ..... + k
  = k (k + 1) / 2             // sum of an arithmetic progression
i.e. p = (k^2 + k) / 2
or,
(k^2 + k) > N                 // substituting p in (1) and discarding the constant 1/2
// we can also discard the lone k, as the degree-2 term dominates, which gives us
k^2 > N
or,
k > sqrt(N)
```

And thus, we can say that the number of iterations, and hence the time complexity of this algorithm, is a function of the square root of N.

```
i.e.
f(N) = O(sqrt(N))
```
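We can also verify this empirically. The Python sketch below (my addition, not from the original pseudocode) counts the iterations of Algorithm 2 and compares the count against sqrt(N); the ratio settles near a constant (about 1.414, i.e. sqrt(2), since the count is roughly sqrt(2N)), which is exactly what O(sqrt(N)) means:

```python
import math

def iterations(N):
    """Count how many times the body of Algorithm 2 runs."""
    count, p, i = 0, 0, 1
    while p <= N:
        p += i           # line 5 of Algorithm 2
        i += 1
        count += 1
    return count

# The ratio count / sqrt(N) stays near a constant as N grows.
for N in (100, 10_000, 1_000_000):
    print(N, iterations(N), iterations(N) / math.sqrt(N))
```

Constant factors like sqrt(2) are discarded, just as with the constant step sizes earlier, so the complexity is O(sqrt(N)).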
