We calculate the time and space complexity of an algorithm by finding the growth rate of its cost function for different values of N, where N is the size of the input. In most cases, we check an algorithm for large values of N to get its worst-case time complexity.
One algorithm may give the best results for smaller values of N while, for the same problem, another algorithm gives the best results for larger values of N. We can't predict in advance after which value of N one algorithm becomes better than the other, and this is why we need to check different values of N.
But isn't it a tedious task to plug in different values of N to find the growth rate of a program? This is where asymptotic notations (more specifically, Big O notation) come in: they describe the growth rate, i.e., the time complexity, of our programs.
Before going further into time complexity calculation, we highly recommend you go through our Algorithms Introduction and Analysis post once. It will help you understand the terminology used in this post.
What is Big O Notation?
When we want to find the running time of an algorithm, we are not interested in the exact running time, because calculating it exactly for every input is not practical. Instead, we calculate an approximate running time, and Big O notation helps us do this.
Big O notation expresses the time complexity in terms of the size of the input, and the best part is that it gives an upper bound on the growth of the function. This upper bound tells us the worst-case time complexity of an algorithm, because the function can never grow faster than it.
Condition for Big O Notation:
If f(n) and g(n) are two functions, then
f(n) = O(g(n)) if there exist constants c > 0 and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
Example: Given f(n) = n, is f(n) = O(g(n)) for g(n) = 2n?
f(n) ≤ c·g(n)
n ≤ c·2n, which holds for c = 1 and n0 = 1
Yes, f(n) = O(g(n)), so f(n) = O(2n) = O(n): linear growth.
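To see the definition in action, here is a minimal sketch (the helper and function names are ours, not from the post) that samples the Big O condition f(n) ≤ c·g(n) over a finite range. It demonstrates, but of course does not prove, the asymptotic bound.

```cpp
#include <cassert>

// Check f(n) <= c * g(n) for every n in [n0, limit].
// Sampling finitely many n only demonstrates the bound, it does not prove it.
bool big_o_holds(long long (*f)(long long), long long (*g)(long long),
                 long long c, long long n0, long long limit) {
    for (long long n = n0; n <= limit; n++) {
        if (f(n) > c * g(n)) return false;
    }
    return true;
}

long long f_linear(long long n) { return n; }      // f(n) = n
long long g_linear(long long n) { return 2 * n; }  // g(n) = 2n
```

Here `big_o_holds(f_linear, g_linear, 1, 1, 1000)` returns true, matching the choice c = 1 and n0 = 1 above.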
Below are the growth rates of some standard functions.
Let's understand the time complexity calculation with some example programs.
Example: Program to calculate the sum of first N natural numbers (using a loop).
//C++ code to sum the first n natural numbers
#include<iostream>
using namespace std;
int main(){
    int sum = 0, n;        // ---> 1 time
    cin >> n;              // ---> 1 time
    for(int i = 1; i <= n; i++){
        sum = sum + i;     // ---> n times
    }
    cout << sum;           // ---> 1 time
    return 0;              // ---> 1 time
}
Input: 5
Output: 15
Calculation of Time complexity:
Total number of instructions = f(n) = n + 4
Let g(n) = n
f(n) ≤ c·g(n)
n + 4 ≤ c·n
With c = 2: n + 4 ≤ 2n, i.e., n ≥ 4
Is f(n) = O(g(n))? Yes, f(n) ≤ c·g(n) for all n ≥ n0, where c = 2 and n0 = 4.
f(n) = O(n) Linear Time Complexity
Example: Program to calculate the sum of first N natural numbers (using a formula).
//C++ code to sum the first n natural numbers
#include<iostream>
using namespace std;
int main(){
    int sum = 0, n;        // ---> 1 time
    cin >> n;              // ---> 1 time
    sum = n*(n+1)/2;       // ---> 1 time
    cout << sum;           // ---> 1 time
    return 0;              // ---> 1 time
}
Input: 5
Output: 15
Calculation of Time complexity:
Total number of instructions = f(n) = 5
Let g(n) = 1
f(n) ≤ c·g(n)
5 ≤ c·1; take c = 6, then 5 ≤ 6
Is f(n) = O(g(n))? Yes, f(n) ≤ c·g(n) for all n ≥ n0, where c = 6 and n0 = 1.
f(n) = O(1) Constant Time Complexity
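The two programs above compute the same value; only the cost of computing it differs. A small sketch (the function names are ours) factors the two bodies into functions so the O(n) loop and the O(1) formula can be compared directly:

```cpp
#include <cassert>

// Sum of the first n natural numbers with a loop: n iterations, O(n) time.
long long sum_loop(long long n) {
    long long sum = 0;
    for (long long i = 1; i <= n; i++) sum += i;
    return sum;
}

// Same sum with the closed-form formula n(n+1)/2: one step, O(1) time.
long long sum_formula(long long n) {
    return n * (n + 1) / 2;
}
```

Both return 15 for n = 5, but the loop does n times more work to get there.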
void fun(int n){
    int i, j;
    for(i = 1; i <= n/3; i++){        //Loop 1
        for(j = 1; j <= n; j += 4){   //Loop 2
            cout << "Hello";
        }
    }
}
Calculation of Time complexity:
For Loop 1: the loop runs from i = 1 to i = n/3, incremented by one each time.
Iteration 1: i = 1
Iteration 2: i = 2
Iteration 3: i = 3
...
Iteration k: i = n/3, so k = n/3 ≈ O(n)
Time Complexity = O(n)
For Loop 2: the loop runs from j = 1 to j = n, incremented by four each time.
Iteration 1: j = 1
Iteration 2: j = 1 + 4
Iteration 3: j = 1 + 4 + 4 = 1 + 2·4
...
Iteration k: j = 1 + (k - 1)·4 = n
n - 1 = (k - 1)·4
(n - 1)/4 = k - 1
k = (n - 1)/4 + 1 ≈ O(n)
Time Complexity = O(n)
Total Time Complexity = O(n) x O(n) = O(n^2)
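We can sanity-check the O(n^2) result by counting iterations empirically; the counting wrapper below is our addition, not part of the original program:

```cpp
// Count how many times the inner statement of fun() runs.
// Loop 1 runs ~n/3 times and Loop 2 runs ~n/4 times per outer pass,
// so the count grows like n^2/12 -- quadratic in n.
long long count_fun_iterations(int n) {
    long long count = 0;
    for (int i = 1; i <= n / 3; i++) {       // Loop 1
        for (int j = 1; j <= n; j += 4) {    // Loop 2
            count++;                         // stands in for cout << "Hello";
        }
    }
    return count;
}
```

For n = 12 this gives 12 = 12^2/12, and quadrupling n to 48 multiplies the count by 16, the signature of quadratic growth.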
Time Complexities of Common Programming Conditions:
Single Loop:
for(int i = 0; i < n; i++){
    //statements
}
- The above loop executes n times.
- Time Complexity: O(n)
Nested Loops: one loop runs inside another, and we multiply the running times of the loops to get the total time.
for(int i = 0; i < n; i++){
    for(int j = 0; j < n; j++){
        //statements
    }
}
- The outer loop executes n times and the inner loop also executes n times.
- Time Complexity: O(n × n) = O(n^2)
Combination of Loops:
int n = 10;      // 1 time
int sum = 0;     // 1 time
for(int i = 1; i <= n; i++){
    //statements // n times
}
for(int i = 0; i < n; i++){
    for(int j = 0; j < n; j++){
        //statements // n^2 times
    }
}
- The first two lines of code will take constant time to execute, the single for loop statement will execute n times and the nested for loops will execute n^2 times.
- Time Complexity: O(n^2 + n + 2) = O(n^2)
if(n == 0){          // ---> 1 time
    //statement      // ---> 1 time
}
else{
    for(int i = 1; i <= n; i++){
        //statement  // ---> n times
    }
}
- The if branch takes constant time (1 + 1 = O(1)); the else branch takes constant time for the condition check plus n executions of the statement (1 + n = O(n)).
- Time Complexity: O(1) + O(n) = O(n)
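A counting sketch (the counter and function name are our addition) makes the two branches concrete by tallying how many statements execute for a given n:

```cpp
// Count statements executed by the if/else above for a given n.
long long count_branch_statements(int n) {
    long long count = 1;                 // the condition check: 1 time
    if (n == 0) {
        count++;                         // single statement: O(1) path
    } else {
        for (int i = 1; i <= n; i++) {
            count++;                     // statement runs n times: O(n) path
        }
    }
    return count;
}
```

count_branch_statements(0) is 2 (constant), while count_branch_statements(n) is n + 1 for n > 0, so the worst case is O(n).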
//Example of logarithmic growth
int i = 1;
while(i <= n){
    //statement
    i = i * 2;
}
- Here i doubles on every pass, so the loop executes about log2(n) times.
- Time Complexity: O(log n)
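Because i doubles each pass, the number of passes grows only logarithmically in n. A counting sketch (the helper name is ours) shows this:

```cpp
// Count passes of the doubling loop i = 1, 2, 4, 8, ... <= n.
// The body runs floor(log2(n)) + 1 times, i.e. O(log n).
int count_doubling_steps(int n) {
    int count = 0;
    for (int i = 1; i <= n; i *= 2) {
        count++;
    }
    return count;
}
```

Doubling n adds only one more pass: count_doubling_steps(16) is 5, while count_doubling_steps(1024) is just 11.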