How To Calculate the Time Complexity of an Algorithm

We calculate the time and space complexity of an algorithm by finding the growth rate of its running-time function for different values of N, where N is the size of the input. In most cases, we examine an algorithm for large values of N to determine its worst-case time complexity.


One algorithm may give the best results for smaller values of N while, for the same problem, another algorithm gives the best results for larger values of N. We cannot predict in advance after which value of N one algorithm becomes better than the other, and this is why we need to check different values of N.


But isn't it a tedious task to plug in different values of N to find the growth rate of a program? This is where asymptotic notations (more specifically, Big O notation) come in: they describe the growth rate, or time complexity, of our programs.

Before going further into time complexity calculation, we highly recommend you go through our Algorithms Introduction and Analysis post once. It will help you understand the terminology used in this post.


What is Big O Notation?

When we want to find the running time of an algorithm, we are not interested in the exact running time, because computing it exactly for every algorithm is not practical. Instead, we calculate an approximate running time, and Big O notation helps us do this.


Big O notation expresses the time complexity in terms of the size of the input. Its key property is that it gives an upper bound on the growth of the function: the function can never grow faster than this bound, which is why Big O describes the worst-case time complexity of an algorithm.


Condition for Big O Notation:

If f(n) and g(n) are two functions, then

f(n) = O(g(n)) if there exist constants c > 0 and n₀ such that f(n) ≤ c.g(n) for all n ≥ n₀.

Example: 
Given f(n) = n
Is f(n) = O(g(n))?

Let g(n) = 2n

f(n) ≤ c.g(n)
   n ≤ c.2n, which holds for c = 1 and all n ≥ n₀ = 1

Yes, f(n) = O(g(n))
     f(n) = O(2n) = O(n) → Linear Growth


Below is the growth rate of some standard functions, from slowest-growing to fastest-growing:

O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2ⁿ) < O(n!)


Let's understand the time complexity calculation with some example programs.

Example: Program to calculate the sum of first N natural numbers (using a loop).

//C++ Code to sum n natural numbers
#include<iostream>
using namespace std;

int main(){

    int sum = 0, n;              // ---> 1 time
    cin>>n;                      // ---> 1 time 

    for(int i = 1; i <= n; i++){ 
        sum = sum + i;           // ---> n times 
    }

    cout<<sum;                  // ---> 1 time 

    return 0;                   // ---> 1 time 
}
Input: 5
Output: 15

Calculation of Time complexity:

Total number of instructions = f(n) = n + 4

f(n) = n + 4
Let g(n) = n

f(n) ≤ c.g(n)
n + 4 ≤ c.n
Take c = 2:
n + 4 ≤ 2n, which holds when n ≥ 4

Is f(n) = O(g(n))?
Yes, f(n) ≤ c.g(n) for all n ≥ n₀,
where c = 2 and n₀ = 4
f(n) = O(n) → Linear Time Complexity

Example: Program to calculate the sum of first N natural numbers (using a formula).

//C++ Code to sum n natural numbers
#include<iostream>
using namespace std;

int main(){

    int sum = 0, n;              // ---> 1 time
    cin>>n;                      // ---> 1 time 

    sum = n*(n+1)/2;            // ---> 1 time
    cout<<sum;                  // ---> 1 time 

    return 0;                   // ---> 1 time 
}
Input: 5
Output: 15

Calculation of Time complexity:

Total number of instructions = f(n) = 5

f(n) = 5
Let g(n) = 1

f(n) ≤ c.g(n)
5 ≤ c.1
Take c = 6:
5 ≤ 6

Is f(n) = O(g(n))?
Yes, f(n) ≤ c.g(n) for all n ≥ n₀,
where c = 6 and n₀ = 1
f(n) = O(1) → Constant Time Complexity

In the above two examples, we solved the same problem using two different methods to show that multiple algorithms can exist for one particular problem. The first approach gives us time complexity O(n) and the second gives us O(1), so we prefer the second solution: constant time complexity is much faster than linear time complexity.

What is the time complexity of the below function?
void fun(int n){
  int i, j;
  for(i = 1; i <= n/3; i++){    //Loop 1
     for(j = 1; j <= n; j+=4){  //Loop 2
         cout<<"Hello";
     }
  }
}
Answer:
Calculation of Time complexity:

For Loop 1: The loop runs from i = 1 to i = n/3, incrementing by one each time.

Iteration 1: i = 1
Iteration 2: i = 2
Iteration 3: i = 3
          .
          .
          .
Iteration k: i = n/3, so k = n/3 → O(n)
Time Complexity = O(n)

For Loop 2: The loop runs from j = 1 to j = n, incrementing by four each time.

Iteration 1: j = 1
Iteration 2: j = 1 + 4
Iteration 3: j = 1 + 4 + 4 = 1 + 2*4
          .
          .
          .
Iteration k: j = n = 1 + (k - 1)*4
      n = 1 + (k - 1)*4
    n-1 = (k - 1)*4
(n-1)/4 = k - 1
(n-1)/4 + 1 = k → O(n)
Time Complexity = O(n)

Total Time Complexity = O(n) × O(n) = O(n^2)

Time Complexities of Common Programming Conditions:

Loops: A simple loop usually executes n times, where n is the size of the input.
for(int i = 0; i < n; i++){
    //statements
}
  • The above loop executes n times.
  • Time Complexity: O(n)


Nested Loops: One loop runs inside another; we multiply the running times of the loops to get the total time.

for(int i = 0; i < n; i++){
    for(int j = 0; j < n; j++)
        //statements
}
  • The outer loop executes n times and the inner loop also executes n times.
  • Time Complexity: O(n × n) = O(n^2)

Consecutive statements: When a series of statements and loops appears inside a function, we sum the time taken by each individual part to get the total running time.
int n = 10;                // 1 time
int sum = 0;               //1 time

for(int i = 1; i <= n; i++){
   //statements           //n times
}

for(int i = 0; i < n; i++){
    for(int j = 0; j < n; j++)
        //statements       //n^2 times
}
  • The first two lines of code will take constant time to execute, the single for loop statement will execute n times and the nested for loops will execute n^2 times. 
  • Time Complexity: O(n^2 + n + 2) = O(n^2)

If-then-else statement: Usually takes constant time, but if loops are present inside the branches, we have to include their running time as well.
if(n == 0){               // ---> 1 time    
  //statement            // ---> 1 time
}
else{
   for(int i = 1; i <= n; i++){
       //statement       // ---> n times
   }
}
  • The if part takes constant time (1 + 1 = O(1)); in the else part, the condition check takes constant time and the statement executes n times (1 + n = O(n)).
  • Time Complexity: O(1) + O(n) = O(n)

Logarithmic Complexity: Logarithmic time complexity is achieved when the problem size is cut down by a constant fraction at each step. log₂(n) tells us how many times 2 must be multiplied by itself to reach the value n.
//Example of Logarithmic growth
int i = 1;
while(i <= n){
   //statement
   i = i*2;
}

  • Time Complexity: O(log₂ n)
I hope you found this post useful. Please share your valuable feedback in the comment section below, and share the post with others so we can keep providing more such valuable content.
