Time Complexity Analysis

Time Complexity | Asymptotic Analysis | Recurrence Relations | Master's Method

What does one mean by Time Complexity?

Time complexity is the amount of time an algorithm takes to run, expressed as a function of the length of its input. Rather than measuring wall-clock time, it counts how many times each statement of the algorithm executes as the input grows.

Before looking into time complexity further, let us first understand asymptotic notations.

What is asymptotic analysis?

Asymptotic analysis refers to computing the running time of any operation in mathematical units of computation.

Example:

  • If the running time of an operation is computed as f(n) = n, the running time increases linearly with n

  • If the running time of an operation is computed as g(n) = n², the running time increases quadratically with n

Usually, the time required by an algorithm falls under three types −

  1. Best Case − Minimum time required for program execution
  2. Average Case − Average time required for program execution
  3. Worst Case − Maximum time required for program execution

Common asymptotic notations

  • Ο Notation
  • Ω Notation
  • θ Notation

Big Oh Notation, Ο -

The notation Ο(n) represents the upper bound of an algorithm's running time. It measures the worst case time complexity or the longest amount of time an algorithm can possibly take to complete.

Example:

(1). 3n+2=O(n)
⇒ As 3n+2 ≤ 4n for all n≥2

(2). 5n³+2=O(n³)
⇒ As 5n³+2 ≤ 6n³ for all n≥2

Omega Notation, Ω -

The notation Ω(n) represents the lower bound of an algorithm's running time. It measures the best case time complexity, i.e. the minimum amount of time an algorithm can take to complete.

Example:

(1). n³+4n² = Ω(n²)
⇒ n² ≤ n³ for all n≥1
⇒ n³ ≤ n³+4n² for all n≥0
⇒ Combining the two, n² ≤ n³+4n² for all n≥1
Hence n³+4n² = Ω(n²)

Theta Notation, θ -

The notation θ(n) represents both the lower bound and the upper bound of an algorithm's running time.

g(n) = θ(f(n)) if and only if g(n) = Ο(f(n)) and g(n) = Ω(f(n))

Example:

(1). n²+5n+7=θ(n²)
⇒ As 0 ≤ n² ≤ n²+5n+7 ≤ 13n² for all n≥1

Common Time Complexities

  • Constant − Ο(1)
  • Logarithmic − Ο(log n)
  • Linear − Ο(n)
  • n log n − Ο(n log n)
  • Quadratic − Ο(n²)
  • Cubic − Ο(n³)
  • Polynomial − nᴼ⁽¹⁾
  • Exponential − 2ᴼ⁽ⁿ⁾

Understanding Time Complexity

So now let us start understanding what time complexity is and how it is calculated.

Consider the following code and calculate its time complexity

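The snippet in the original is an image; it presumably looked something like this (a minimal reconstruction, not the exact original code):

```python
def say_hello():
    # One statement, one unit of work, independent of any input -> O(1)
    return "hello world"

print(say_hello())
```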
Only one statement (print hello world) is executed here, which takes unit time. Hence, the time complexity is constant, i.e. O(1).

Pretty simple, right?

Now consider this code and calculate its time complexity

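Again, the code is an image in the original; a likely reconstruction:

```python
def hello_ten():
    # The loop bound is a fixed constant (10), not the input size,
    # so the total work is still constant -> O(1)
    return ["hello world" for _ in range(10)]
```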
Hello world will be printed 10 times, which is also a constant. Hence, the time complexity here too is O(1).

Ok now let us solve the time complexity for this one

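The original snippet is an image; presumably a loop whose bound depends on the input N, something like:

```python
def hello_n(n):
    # The loop runs n times, so the work grows linearly with n -> O(n)
    return ["hello world" for _ in range(n)]
```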
In the above code, hello world is printed N times. Since the number of iterations depends on N, the time complexity is O(N).

Now try this one

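The code here is an image in the original; based on the explanation below it was two nested loops over N, something like:

```python
def pairs(n):
    # Two nested loops, each running n times: n * n iterations -> O(n^2)
    count = 0
    for i in range(n):
        for j in range(n):
            count += 1
    return count
```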
The running time of the two loops is proportional to the square of N. Hence the time complexity of the above code is O(N²)

Give this a try

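The snippet is an image; from the trace below (i = 2, 4, 8, ...) it was presumably a loop that keeps doubling i while it stays below n:

```python
def doubling_steps(n):
    # i starts at 1 and doubles every iteration; the loop body runs
    # until i reaches n, i.e. about log2(n) times -> O(log n)
    i, steps = 1, 0
    while i < n:
        i *= 2
        steps += 1
    return steps
```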
The time complexity of this algorithm is O(log n)

Wait, did you get that? Let me solve it.

As you can see, i keeps doubling as long as it is less than n. The number of times we can double 1 before it reaches n is about log₂ n.

Still confused? Ok let me help

For any n
1st iteration, i = 2
2nd iteration, i = 4
3rd iteration, i = 8
4th iteration, i = 16
....
kth iteration, i = 2ᵏ

The loop runs while 2ᵏ < n
Taking log on both sides,
k ≈ log₂ n

Hence the time complexity of this algorithm is O(log n)

Try this now

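The code is an image in the original; from the explanation below it was nested loops where the inner loop runs i times, something like:

```python
def triangular(n):
    # Inner loop runs 0 + 1 + ... + (n-1) = n(n-1)/2 times -> O(n^2)
    count = 0
    for i in range(n):
        for j in range(i):
            count += 1
    return count
```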
In this problem
When i=0 the inner for loop runs for 0 time
When i=1 the inner for loop runs for 1 time
When i=2 the inner for loop runs for 2 times
....
and so on till n-1

So
0+1+2+3+.............+n-1 = n(n-1)/2

Hence the time complexity of this algorithm is O(N²)

Having fun till now?
Now let us take some standard examples and understand their time complexities.

Bubble Sort -

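The original shows the code as an image; here is a standard Python version of bubble sort for reference (a sketch, not necessarily the original code):

```python
def bubble_sort(arr):
    # Classic bubble sort: two nested loops over the array -> O(n^2)
    a = list(arr)
    n = len(a)
    for i in range(n - 1):
        # After each pass, the largest remaining element "bubbles"
        # to the end of the unsorted region
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a
```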
The time complexity of bubble sort is O(N²): there are two nested loops, each running up to N times, just as in the nested-loop example above.

Quick Sort -

In quick sort, we find a pivot element and place it at its correct position in such a way that the part of array before this position contains elements smaller than this value and the other part of the array contains elements larger than this value.

Now same computation is done for both parts of the array recursively. The time complexity of the partition function (to place the pivot element at its correct position) is O(N).

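The code appears as images in the original; a compact Python sketch of the partition-and-recurse scheme described above (this version assumes the Lomuto partition with the last element as pivot, which may differ from the original):

```python
def partition(a, lo, hi):
    # Place the pivot (a[hi]) at its correct position: smaller elements
    # end up to its left, larger ones to its right. One linear scan -> O(n)
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i

def quick_sort(a, lo=0, hi=None):
    # Recursively apply the same computation to both parts of the array
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quick_sort(a, lo, p - 1)
        quick_sort(a, p + 1, hi)
    return a
```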
In the best case, the pivot ends up close to the median, i.e. its correct position is near the middle of the array, so the array is divided into two roughly equal parts.
Hence the time complexity is O(N log N). In the worst case (e.g. an already sorted array with the last element as pivot), every partition is maximally unbalanced and the complexity degrades to O(N²).

If you could not understand the time complexity of quick sort here, wait we will look at the time complexity of recursive relations later.

Sorting Algorithms Time complexity

Bubble sort - O(N²)
Selection sort - O(N²)
Insertion sort - O(N²)

Quick sort - O(N log N) on average (O(N²) worst case)
Heap sort - O(N log N)
Merge sort - O(N log N)

Counting sort - O(N + K), where K is the range of input values
Radix sort - O(d·(N + K)), where d is the number of digits
Bucket sort - O(N) on average

Some other basic examples on Time complexity

  • Add two numbers - O(1)
  • Find maximum in an array - O(N)
  • Linear search - O(N)
  • Binary search - O(log N)
  • Find all subsets of a set - O(2ᴺ)
  • Find all permutations of a given set - O(N!)
  • Find solutions of a cubic equation - O(N³)
  • Reverse a number - O(d), where d is the number of digits
  • Matrix multiplication (two N×N matrices, naive) - O(N³)

and so on.

Calculating time complexity for non-recursive algorithms is pretty straightforward. Now we will look at how to calculate time complexity for recursive algorithms.

Recurrence relation

A recurrence is an equation or inequality that describes a function in terms of its values on smaller inputs.

Example :
nth term of a fibonacci sequence is represented as
fib(n) = fib(n-1) + fib(n-2)

Let us try to calculate recurrence relations for some algorithms.

Find the recurrence relation for this algorithm

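The algorithm is an image in the original; any simple linear recursion fits, for instance (an illustrative stand-in, not the original code):

```python
def countdown(n):
    # One unit of work, then a recursive call on n-1:
    # T(0) = O(1), T(n) = T(n-1) + O(1)
    if n == 0:
        return 0
    return 1 + countdown(n - 1)
```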
T(0) = O(1)
T(n) = T(n-1) + O(1)

This is the recurrence relation for this algorithm.

Find recurrence relation for the following logic to find nth term of a Fibonacci sequence using recursion

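The code is an image in the original; the standard naive recursive Fibonacci looks like:

```python
def fib(n):
    # Each call spawns two more: T(n) = T(n-1) + T(n-2) + O(1),
    # which grows exponentially (roughly O(2^n)) without memoization
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)
```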
The recurrence relation for this is:
T(n) = O(1) if n≤1
T(n) = T(n−1) + T(n−2) + O(1) otherwise

Now find recurrence relation for merge sort

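The original shows merge sort as an image; a compact Python version for reference (a sketch, not necessarily the original code):

```python
def merge(left, right):
    # Merge two sorted lists in one linear pass -> O(n)
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

def merge_sort(a):
    # Divide into two halves of size n/2, sort each recursively,
    # then merge: T(n) = 2T(n/2) + O(n)
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))
```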
Here each “divide” step yields two sub-problems of size n/2.
The merge() function to merge two arrays takes O(N) time.

T(n) = a + T(n/2) + T(n/2) + bn
T(n) = 2T(n/2) + (a + bn)
where a and b are constants

Simplified recurrence relation
T(n) = 2T(n/2) + θ(n)

By now you should be familiar with how to calculate the recurrence relation of any recursive algorithm.

Methods to solve Recurrence relations

  • Substitution Method
  • Iterative Method
  • Recursive Tree Method
  • Master's Method

Substitution Method

We make a guess for the solution and then use mathematical induction to prove whether the guess is correct.

Example:

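The worked example in the original is an image; as an illustration (not necessarily the same example), here is the substitution method applied to the merge-sort recurrence from above:

```
Recurrence: T(n) = 2T(n/2) + n,  T(1) = 1
Guess:      T(n) = O(n log n), i.e. T(n) ≤ c·n·log n for some c > 0

Inductive step (assume the bound holds for n/2):
T(n) = 2T(n/2) + n
     ≤ 2·c·(n/2)·log(n/2) + n
     = c·n·log n − c·n + n
     ≤ c·n·log n            for any c ≥ 1

Hence T(n) = O(n log n)
```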
Iterative Method

We expand the recurrence step by step and express it as a summation of terms involving n and the initial condition.

Example:

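The worked example in the original is an image; as an illustration, here is the iterative method applied to the simple linear recurrence seen earlier:

```
Recurrence: T(n) = T(n−1) + c,  T(0) = 1

Expanding:
T(n) = T(n−1) + c
     = T(n−2) + 2c
     = T(n−3) + 3c
     ...
     = T(0) + n·c
     = 1 + c·n

Hence T(n) = O(n)
```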
Recursive Tree Method

In this method, we convert the recurrence into a tree and then we sum the costs of all the levels of the tree

Example:

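The tree in the original is an image; as an illustration, here is the recursion tree for the merge-sort recurrence, summing the cost of each level:

```
Recurrence: T(n) = 2T(n/2) + c·n

Level 0:  one node of cost c·n                 → level cost c·n
Level 1:  two nodes of cost c·n/2              → level cost c·n
Level 2:  four nodes of cost c·n/4             → level cost c·n
...
Level log₂ n:  n leaves of constant cost       → level cost c·n

There are log₂ n + 1 levels, each costing c·n, so
T(n) = c·n·(log₂ n + 1) = θ(n log n)
```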

Master's Method

Master's method is used to solve recurrence relations of the form
T(n) = aT(n/b) + f(n)
where a ≥ 1 and b > 1

There are three cases -

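The cases appear as an image in the original; they are the standard statement of the Master theorem, comparing f(n) with n^(log_b a):

```
Case 1: If f(n) = O(n^(log_b a − ε)) for some ε > 0,
        then T(n) = θ(n^(log_b a))

Case 2: If f(n) = θ(n^(log_b a)),
        then T(n) = θ(n^(log_b a) · log n)

Case 3: If f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and
        a·f(n/b) ≤ c·f(n) for some c < 1 and all large n,
        then T(n) = θ(f(n))
```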
Examples:

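The worked examples in the original are images; here are four standard ones (not necessarily the same four) covering each case, including one where the method fails:

```
(1). T(n) = 8T(n/2) + n²
     n^(log₂ 8) = n³ and n² = O(n^(3−ε))      →  Case 1  →  T(n) = θ(n³)

(2). T(n) = 2T(n/2) + n
     n^(log₂ 2) = n and n = θ(n)              →  Case 2  →  T(n) = θ(n log n)

(3). T(n) = 2T(n/2) + n²
     n² = Ω(n^(1+ε)), regularity holds        →  Case 3  →  T(n) = θ(n²)

(4). T(n) = 2T(n/2) + n/log n
     n/log n is smaller than n^(log₂ 2) = n, but not polynomially
     smaller, so none of the three cases applies
```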

For some recurrences, such as T(n) = 2T(n/2) + n/log n, none of the three cases applies and the Master method fails.
What do we do now?

Well, we have an extended Master's method too.

Extended Master's Method -

This is an advanced version of the Master's theorem and used to solve recurrence relations of the form
T(n) = aT(n/b) + θ(nᵏlogᵖn)
where a ≥ 1, b > 1, k ≥ 0 and p ∈ ℝ

There are three cases for this theorem -

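The case table in the original is an image; the commonly used statement for T(n) = aT(n/b) + θ(nᵏ logᵖ n) is:

```
Case 1: a > bᵏ              →  T(n) = θ(n^(log_b a))

Case 2: a = bᵏ
        if p > −1           →  T(n) = θ(nᵏ logᵖ⁺¹ n)
        if p = −1           →  T(n) = θ(nᵏ log log n)
        if p < −1           →  T(n) = θ(nᵏ)

Case 3: a < bᵏ
        if p ≥ 0            →  T(n) = θ(nᵏ logᵖ n)
        if p < 0            →  T(n) = θ(nᵏ)
```

For instance, T(n) = 2T(n/2) + n/log n has a = 2, b = 2, k = 1, p = −1, which falls in Case 2 with p = −1 and gives T(n) = θ(n log log n).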
We can put the values of a, b, k and p and solve time complexity for any problem now.

Conclusion

I hope you now feel confident calculating the time complexity of any algorithm.

All the best

This is the end of the blog. Give a like and follow if you liked it.