
How To Calculate Time Complexity of an Algorithm.

We calculate the time and space complexity of an algorithm by finding the growth rate of its cost function for different values of N, where N is the size of the input. In most cases, we check an algorithm for larger values of N to get the worst-case time complexity of the algorithm. 


One algorithm may give the best results for smaller values of N while, for the same problem, another algorithm performs best for larger values of N. We can't predict in advance after which value of N one algorithm becomes better than the other, and this is why we need to check different values of N. 


But isn't it a tedious task to plug in different values of N to find the growth rate of a program? This is where asymptotic notations (more specifically, Big O notation) come in: they let us describe the growth rate, and hence the time complexity, of our programs. 

Before going further into time complexity calculation, we highly recommend you go through our Algorithms Introduction and Analysis post first. It will help you understand the terminology used in this post. (alert-passed)  


What is Big O Notation?

When we want to find the running time of an algorithm, we are not interested in the exact running time, because it is not practical to calculate the exact running time of every algorithm. We always calculate an approximate running time, and Big O notation helps us achieve this. 


Big O notation helps us express the time complexity in terms of the size of the input, and the best part is that it gives an upper bound on the function. This upper bound tells us the worst-case time complexity of an algorithm, because the function can never grow faster than it.


Condition for Big O Notation:

If f(n) and g(n) are the two functions, then

f(n) = O(g(n)) if there exist constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.

Example: 
Given f(n) = n
Is f(n) = O(g(n))?

Let g(n) = 2n

f(n) ≤ c·g(n)
   n ≤ c·2n for c = 1 and n0 = 1

Yes, f(n) = O(g(n))
     f(n) = O(2n) = O(n), Linear Growth


Below is the growth rate of some standard functions, ordered from slowest-growing to fastest-growing:

O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(2^n) < O(n!)


Let's understand the time complexity calculation with some example programs.

Example: Program to calculate the sum of first N natural numbers (using a loop).

//C++ Code to sum n natural numbers
#include<iostream>
using namespace std;

int main(){

    int sum = 0, n;              // ---> 1 time
    cin>>n;                      // ---> 1 time 

    for(int i = 1; i <= n; i++){ 
        sum = sum + i;           // ---> n times 
    }

    cout<<sum;                  // ---> 1 time 

    return 0;                   // ---> 1 time 
}
Input:
5
Output:
15

Calculation of Time complexity:

Total number of instructions = f(n) = n + 4

f(n) = n + 4
Let g(n) = n

f(n) ≤ c·g(n)
n + 4 ≤ c·n
Take c = 2: n + 4 ≤ 2n holds when n ≥ 4

Is f(n) = O(g(n))?
Yes, f(n) ≤ c·g(n) for all n ≥ n0,
where c = 2 and n0 = 4
f(n) = O(n), Linear Time Complexity

Example: Program to calculate the sum of first N natural numbers (using a formula).

//C++ Code to sum n natural numbers
#include<iostream>
using namespace std;

int main(){

    int sum = 0, n;              // ---> 1 time
    cin>>n;                      // ---> 1 time 

    sum = n*(n+1)/2;            // ---> 1 time
    cout<<sum;                  // ---> 1 time 

    return 0;                   // ---> 1 time 
}
Input:
5
Output:
15

Calculation of Time complexity:

Total number of instructions = f(n) = 5

f(n) = 5
Let g(n) = 1

f(n) ≤ c·g(n)
5 ≤ c·1
Take c = 6:
5 ≤ 6

Is f(n) = O(g(n))?
Yes, f(n) ≤ c·g(n) for all n ≥ n0,
where c = 6 and n0 = 1
f(n) = O(1), Constant Time Complexity

In the above two examples, we solved the same problem using two different methods, to show that multiple algorithms can exist for one particular problem. The first approach gives us time complexity O(n) and the second gives O(1), so we will use the second solution, as constant time complexity is much faster than linear time complexity. 

What is the time complexity of the below function?
void fun(int n){
  int i, j;
  for(i = 1; i <= n/3; i++){    //Loop 1
     for(j = 1; j <= n; j+=4){  //Loop 2
         cout<<"Hello";
     }
  }
}
Answer:
Calculation of Time complexity:

For Loop 1: the loop runs from i = 1 to i = n/3, incremented by one
unit each time.

Iteration 1: i = 1
Iteration 2: i = 2
Iteration 3: i = 3
          .
          .
          .
Iteration k: i = n/3, so k = n/3 ⇒ O(n)
Time Complexity = O(n)

For Loop 2: the loop runs from j = 1 to j = n, incremented by four
units each time.

Iteration 1: j = 1
Iteration 2: j = 1 + 4
Iteration 3: j = 1 + 4 + 4 = 1 + 2*4
          .
          .
          .
Iteration k: j = n = 1 + (k - 1)*4
          n = 1 + (k - 1)*4
      n - 1 = (k - 1)*4
  (n - 1)/4 = k - 1
  (n - 1)/4 + 1 = k ⇒ O(n)
Time Complexity = O(n)      

Total Time Complexity = O(n) × O(n) = O(n^2)

Time Complexities of Common Programming Conditions:

Loops: A simple loop usually executes n times, where n is the size of the input.
for(int i = 0; i < n; i++){
    //statements
}
  • The above loop executes n times.
  • Time Complexity: O(n)


Nested Loops: One loop runs inside another loop, and we multiply the running times of the two loops to get the total time.

for(int i = 0; i < n; i++){
    for(int j = 0; j < n; j++)
        //statements
}
  • The outer loop executes n times and the inner loop also executes n times.
  • Time Complexity: O(n × n) = O(n^2)

Consecutive statements: When a series of statements and loops appears inside a function, we sum the time taken by each individual part to get the total running time of the function.
int n = 10;                // 1 time
int sum = 0;               //1 time

for(int i = 1; i <= n; i++){
   //statements           //n times
}

for(int i = 0; i < n; i++){
    for(int j = 0; j < n; j++)
        //statements       //n^2 times
}
  • The first two lines of code will take constant time to execute, the single for loop statement will execute n times and the nested for loops will execute n^2 times. 
  • Time Complexity: O(n^2 + n + 2) = O(n^2)

If-then-else statement: Usually takes constant time, but if loops are present inside the branches, then we have to include their running time as well.
if(n == 0){               // ---> 1 time    
  //statement            // ---> 1 time
}
else{
   for(int i = 1; i <= n; i++){
       //statement       // ---> n times
   }
}
  • The if part takes constant time (1 + 1 = O(1)); the else part takes constant time for the condition check, and its statement executes n times (1 + n = O(n)).
  • Time Complexity: O(1) + O(n) = O(n)

Logarithmic Complexity: Logarithmic time complexity is achieved when the problem size is cut down by a constant fraction at each step. log2(n) tells us how many times 2 must be multiplied by itself to obtain the value n — equivalently, how many times n can be halved before reaching 1.
//Example of Logarithmic growth
int i = 1;
while(i <= n){
   //statement
   i = i*2;
}

  • Time Complexity: O(log2 n) = O(log n) 
I hope you found this post useful. Please share your valuable feedback in the comment section below, and share the post with others so we can keep providing more such valuable content. (alert-success)

Algorithms Introduction and Analysis

An Algorithm is a step-by-step procedure that helps us in solving computational problems. Using the right algorithm to solve a particular kind of problem increases the efficiency of code in terms of space and time complexity.
Space Complexity: Space complexity is the amount of memory an algorithm requires to produce the desired output. The memory is consumed by the input data, temporary data, and output data of the algorithm. (alert-success)
Time Complexity: Time complexity is the amount of time an algorithm requires to execute and produce the desired output. It depends on the number of iterative or recursive steps involved in the execution of the algorithm. (alert-success)
Sometimes, people get confused between the terms Algorithm and Program. An algorithm is the design part of solving a problem, and the program is the implementation part of solving that problem.  

We can write an algorithm in any language, such as English, or by using mathematical notation. But the program for that algorithm must be written in a programming language like C, C++, Java, or Python.

How To Analyse an Algorithm?

You can often find more than one algorithm for solving a particular problem; you then have to analyze them and use the most efficient one. 

The analysis of an algorithm is done based on its efficiency. The two important terms used for the analysis of an algorithm are “Priori Analysis” and “Posterior Analysis”.

Priori Analysis: It is done before the actual implementation of the algorithm, when the algorithm is written in a general theoretical language. Here, the efficiency of the algorithm is estimated based on its complexity. It is only an approximate analysis. (alert-passed)  
Posterior Analysis: It is done after the actual implementation and execution of the algorithm using a programming language like C, C++, Java, or Python. It is the actual analysis, in which the space and time complexity of the algorithm are calculated more accurately. (alert-passed) 

Analysis of Algorithms Based on Space and Time Complexity. 

The complexity of any algorithm is based on two important factors, space complexity and time complexity, but it also depends on many other factors, such as which language, compiler, or machine you use to implement the algorithm, so measuring the exact time is very difficult.

Therefore, the time complexity of an algorithm is not measured in time units like seconds or microseconds. It is expressed in terms of the size of the input: the bigger the input, the more time the algorithm will take.

We always analyze an algorithm for larger inputs, because for smaller inputs different algorithms can look similar. If the size of the input is n, then f(n), a function of n, denotes the time complexity.

For example:
The time complexity of a program is given by the function f(n) = 2n^2 + 2n + 1. If we evaluate the function for different inputs (n = 1, n = 10, n = 100, n = 1000), we can observe that the n^2 term dominates the other terms. Since it is difficult to work with the exact function for time complexity, we ignore the smaller terms like 2n and the constant, and in asymptotic terms we say that the time complexity of the given function is O(n^2).
This method of measuring the approximate complexity is known as asymptotic complexity. (alert-success) 

Now, I hope you understand the above example and the term asymptotic but even if you are unable to get it completely don't worry as we are going to cover this in more detail soon. 

First, we need to learn more about the types of asymptotic notation. Three asymptotic notations are used to calculate the space and time complexity of an algorithm: Big O, Big Ω, and Big Θ. But we are not going to calculate all three every time for an algorithm.  

Types of Asymptotic notation.

We use these three notations to express the complexity of an algorithm in three different cases: 

Big O describes the worst-case time, or upper bound, which means that if we give a time complexity in terms of Big O, then in any case your time will not exceed this.

Big Ω describes the best-case time, or lower bound, which means that in any case you can't achieve better than this. In this notation, we analyze the minimum number of steps that need to be executed. 

Big Θ is going to say the average case time, somewhere in between the worst and best. In this notation, we calculate the average by considering different types of inputs including the best case and the worst case.

In practice, we are not interested in what is the best time the algorithm could execute. We are always interested in what is the worst-case time that will be taken by this algorithm for any input.

Lower Bound < Average Bound < Upper Bound

We use the average case when the best case and the worst case are the same, i.e., when both take the same time.

Example: Assume an array data structure of length n.
We want to search for an element using linear search. If the element we want to find is 5, then in the best case we can find it in 1 comparison, in a single step.

=> Ω(1) time complexity, and it says that you can never go below this.

Now, if we want to search for 3 among the n elements in the array, then in the worst case we need to compare all the elements, which gives O(n), and it says that you can never go beyond this complexity. 

C++ Example Code for Linear Search.

//Linear Search Algorithm code in C++
#include <iostream>
using namespace std;

int linearSearch(int arr[], int n, int target) {
    for (int i = 0; i < n; i++) {
        if (arr[i] == target) {
            return i; // return the index of the target element if found
        }
    }
    return -1; // return -1 if target element is not found
}

int main() {
    int arr[] = {5, 2, 8, 3, 1, 9};
    int n = sizeof(arr) / sizeof(arr[0]);
    int target = 3;

    int index = linearSearch(arr, n, target);
    if (index != -1) {
        cout << "Element found at index: " << index << endl;
    } else {
        cout << "Element not found." << endl;
    }

    return 0;
}
Output:
Element found at index: 3

In the above code, we have implemented the linear search algorithm to find the target element in the given array. The algorithm iterates through each element of the array and compares it with the target element. If a match is found, the index is returned; otherwise, -1 is returned to indicate that the element was not found.

Now I hope you get some idea of how we can calculate the best, worst, and average time complexity of an algorithm. We are going to practice a lot of questions to master this concept in our upcoming articles. 

Related Post:
👉 Support us by sharing our post with others if you really found this useful, and also share your valuable feedback in the comment section below so we can work on it and provide you the best 😊. (alert-success) 

© all rights reserved
made with by AlgoLesson