In the previous post, we studied the Complexity Analysis of Common Loops. As you continue this journey of understanding Data Structures, you will use the concept of Recursion to make your code more efficient.
When we analyze these functions, we get a recurrence relation for the time complexity. For example, in Binary Search, to search for a given element we divide the array/list into two halves and recursively repeat the process on the half that can contain the element, until we find the element we are searching for. The time complexity of Binary Search is T(n) = T(n/2) + c, i.e., Ө(log n). There are many such algorithms, like the Tower of Hanoi, the Josephus Problem, and many more.
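To make the recurrence concrete, here is a minimal sketch of recursive Binary Search in C (the function name, signature, and main driver are my own illustration, not code from this series). Each call does a constant amount of work c and then recurses on one half, giving T(n) = T(n/2) + c:

#include <stdio.h>

/* Searches for key in the sorted array a[lo..hi].
   Returns the index of key, or -1 if it is not present. */
int binary_search(int a[], int lo, int hi, int key) {
    if (lo > hi)
        return -1;                      /* empty range: not found */
    int mid = lo + (hi - lo) / 2;       /* constant work: c */
    if (a[mid] == key)
        return mid;
    if (a[mid] < key)                   /* recurse on one half: T(n/2) */
        return binary_search(a, mid + 1, hi, key);
    return binary_search(a, lo, mid - 1, key);
}

int main(void) {
    int a[] = {2, 5, 8, 12, 16, 23, 38, 56, 72, 91};
    printf("%d\n", binary_search(a, 0, 9, 23));   /* prints 5 */
    return 0;
}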
We have three methods for calculating the complexity of a recursive function:
1) Substitution Method
2) Tracing Tree Method
3) Master Theorem
First, let's take a simple example:

void Test(int n)           --- T(n)
{
    if (n > 0)
    {
        printf("%d ", n);  --- 1
        Test(n - 1);       --- T(n-1)
    }
}

So, we have the time complexity function T(n) = T(n-1) + 1.
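To see why this recurrence gives O(n), we can expand it by repeated substitution (this expansion is an added illustration; it only assumes T(0) is a constant):

T(n) = T(n-1) + 1
     = T(n-2) + 2
     = T(n-3) + 3
     ...
     = T(n-k) + k

Putting k = n gives T(n) = T(0) + n, which is O(n).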
For analyzing the time taken or the space allotted by loops such as for and while, there is a method called the frequency count method. Look at the algorithm below...
Example 1:
Algorithm Sum(A, n){
    s = 0;                    --> 1 unit
    for (i = 0; i < n; i++){  --> n+1 units
        s = s + A[i];         --> n units
    }
    return s;                 --> 1 unit
}
Total => 1 + (n+1) + n + 1 = 2n + 3
The time taken by the algorithm can be found by assigning one unit of time to each statement; if a statement repeats some number of times, we count the frequency of that statement (the number of times it executes), and adding everything up gives the time taken by the algorithm.
The for loop here costs n + 1 units because the condition i < n is checked n + 1 times.
And for space complexity:
We have 4 variables: A, n, s, and i. A is an array that can hold n elements, hence it takes n units of space. The remaining variables n, s, and i each hold a single value, so they take 1 unit each. (So, we have n + 3.)
So, here:
Time Complexity: 2n + 3 = O(n)
Space Complexity: n + 3 = O(n)
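As a quick check, here is a minimal runnable C version of the Sum pseudocode above (the sample array and main driver are my own additions for illustration):

#include <stdio.h>

int Sum(int A[], int n) {
    int s = 0;                       /* 1 unit */
    for (int i = 0; i < n; i++) {    /* condition checked n+1 times */
        s = s + A[i];                /* executes n times */
    }
    return s;                        /* 1 unit */
}

int main(void) {
    int A[] = {3, 1, 4, 1, 5};
    printf("%d\n", Sum(A, 5));       /* prints 14 */
    return 0;
}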
Example 2:
Algorithm Add(A, B, n){
    for (i = 0; i < n; i++){      --> n + 1
        for (j = 0; j < n; j++){  --> n(n+1)
            c[i, j] = A[i, j] + B[i, j];  --> n x n
        }
    }
}
Time Complexity: (n+1) + n(n+1) + n² = 2n² + 2n + 1 = O(n²)
Space Complexity: O(n²)
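A minimal runnable C version of this matrix addition, assuming square n x n matrices (the fixed size N, the sample matrices, and main are my own additions):

#include <stdio.h>
#define N 2

void Add(int A[N][N], int B[N][N], int C[N][N], int n) {
    for (int i = 0; i < n; i++)               /* checked n+1 times */
        for (int j = 0; j < n; j++)           /* checked n(n+1) times */
            C[i][j] = A[i][j] + B[i][j];      /* n x n additions */
}

int main(void) {
    int A[N][N] = {{1, 2}, {3, 4}};
    int B[N][N] = {{5, 6}, {7, 8}};
    int C[N][N];
    Add(A, B, C, N);
    printf("%d %d\n%d %d\n", C[0][0], C[0][1], C[1][0], C[1][1]);   /* 6 8 / 10 12 */
    return 0;
}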
Example 3:
Algorithm Multiply(A, B, n){
    for (i = 0; i < n; i++){          --> n+1
        for (j = 0; j < n; j++){      --> n(n+1)
            c[i, j] = 0;              --> n x n
            for (k = 0; k < n; k++){  --> n x n x (n+1)
                c[i, j] = c[i, j] + A[i, k] * B[k, j];  --> n x n x n
            }
        }
    }
}
Time Complexity: (n+1) + n(n+1) + n² + n²(n+1) + n³ = 2n³ + 3n² + 2n + 1 = O(n³)
Space Complexity: O(n²)
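Likewise, a minimal runnable C version of the matrix multiplication above, under the same assumptions (fixed size N, sample matrices, and main are my own additions):

#include <stdio.h>
#define N 2

void Multiply(int A[N][N], int B[N][N], int C[N][N], int n) {
    for (int i = 0; i < n; i++)                   /* checked n+1 times */
        for (int j = 0; j < n; j++) {             /* checked n(n+1) times */
            C[i][j] = 0;                          /* n x n assignments */
            for (int k = 0; k < n; k++)           /* checked n x n x (n+1) times */
                C[i][j] += A[i][k] * B[k][j];     /* n x n x n multiply-adds */
        }
}

int main(void) {
    int A[N][N] = {{1, 2}, {3, 4}};
    int B[N][N] = {{5, 6}, {7, 8}};
    int C[N][N];
    Multiply(A, B, C, N);
    printf("%d %d\n%d %d\n", C[0][0], C[0][1], C[1][0], C[1][1]);   /* 19 22 / 43 50 */
    return 0;
}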
Example 4:
for (i = 1; i < n; i = i+2){
sum++; --> n/2
}
Time Complexity: n/2 = O(n)
Example 5:
for (i = 0; i < n; i++){      --> n+1
    for (j = 0; j < n; j++){  --> n(n+1)
        sum++;                --> n x n
    }
}
Time Complexity: (n+1) + n(n+1) + n² = 2n² + 2n + 1 = O(n²)
Example 6:
for (i = 0; i < n; i++){
    for (j = 0; j < i; j++){
        stmt;
    }
}
Time Complexity: 0 + 1 + 2 + 3 + ... + n = n(n+1)/2 = (n² + n)/2 = O(n²)
Example 7:
p = 0;
for (i = 0; p <= n; i++){
p = p + i;
}
Assume p > n (the loop stops).
After k iterations, p = 1 + 2 + 3 + ... + k, ∴ p = k(k+1)/2
The loop stops when k(k+1)/2 > n
i.e., approximately k² > n
∴ k > √n
∴ the loop executes O(√n) times
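If you want to convince yourself empirically, here is a small C program (my own addition) that counts the iterations of the loop from Example 7; for n = 1,000,000 it reports roughly √(2n) ≈ 1414 iterations:

#include <stdio.h>

int main(void) {
    long n = 1000000;
    long p = 0, count = 0;
    for (long i = 0; p <= n; i++) {   /* the loop from Example 7 */
        p = p + i;
        count++;
    }
    printf("n = %ld, iterations = %ld\n", n, count);   /* about 1415 */
    return 0;
}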
Example 8:
for (i = 1; i < n; i=i*2){
sum ++;
}
Assume i >= n (the loop stops)
After k doublings, i = 2^k
2^k >= n
For simplicity, take 2^k = n
k = log₂ n
Time Complexity: O(log₂ n)
Example 9:
for (i = n; i >= 1; i = i/2)
    stmt1;

Assume i < 1 (the loop stops)
After k halvings, i = n/2^k
n/2^k < 1
For simplicity, the above can be written as
n/2^k = 1
n = 2^k
log₂ n = k
Time Complexity: O(log₂ n)
Example 10:
for (i = 0; i*i < n; i++)
stmt;
Assume i*i >= n (the loop stops)
i² = n
i = √n
Time Complexity: O(√n)
Try this out...
Q.11)
for (i = 0; i < n; i++)
stmt1;
for (j = 0; j < n; j++)
stmt2;
Q.12)
p = 0
for (i = 1; i < n; i = i*2)
p++;
for (j = 1; j < p; j = j*2)
stmt;
Q.13)
for (i = 0; i < n; i++) {
    for (j = 1; j < n; j = j * 2) {
        stmt;
    }
}
To summarize, we have:

for (i = 0; i < n; i++)        --- O(n)
for (i = 0; i < n; i = i+2)    --- O(n)
for (i = n; i > 1; i--)        --- O(n)
for (i = 1; i < n; i = i*2)    --- O(log₂ n)
for (i = 1; i < n; i = i*3)    --- O(log₃ n)
for (i = n; i > 1; i = i/2)    --- O(log₂ n)
Just as we analyzed for loops, we can analyze the complexity of while loops.
Example 14:

i = 0;           -- 1
while (i < n){   -- n + 1
    stmt;        -- n
    i++;         -- n
}
====================
f(n) = 3n + 1 = O(n)
Time Complexity: O(n)
Example 15:
i = 1; k = 1;
while (k < n){
    stmt;
    k = k + i;
    i++;
}
Analysis:
Value of i     Value of k
1              1
2              1+1
3              1+1+2
4              1+1+2+3
5              1+1+2+3+4
:              :
m              1+1+2+3+...+(m-1) ≈ 1+2+3+...+m
So, when the loop stops, we have
k ≥ n
m(m+1)/2 ≥ n
m² ≥ n (approx.)
∴ f(n) = O(√n)
Time Complexity: O(√n)
Try this out...
Q. 16)
while (m != n){
    if (m > n)
        m = m - n;
    else
        n = n - m;
}
Q.17)
Algorithm Test(n){
    if (n < 5)
        print "%d ", n;
    else {
        for (i = 0; i < n; i++)
            print "%d ", i;
    }
}
So, with this I end this post. Next, we will be discussing the Analysis of Recursion.
It's Harsh Jaiswal
Signing off...
Thanks...
In the previous post, we have already studied Best, Average, and Worst-Case analysis.
Now let's talk about Notations.
There are three types of Notations (mainly):
Big O Notation: f(n) ≤ c g(n)
Omega Notation: f(n) ≥ c g(n)
Theta Notation: c₂ g(n) ≤ f(n) ≤ c₁ g(n)
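For completeness, here are the standard formal definitions behind these inequalities (added here for reference; c, c₁, c₂ are positive constants and n₀ is a threshold beyond which the inequality must hold):

f(n) = O(g(n)) if there exist constants c > 0 and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀
f(n) = Ω(g(n)) if there exist constants c > 0 and n₀ such that f(n) ≥ c·g(n) for all n ≥ n₀
f(n) = Θ(g(n)) if there exist constants c₁, c₂ > 0 and n₀ such that c₂·g(n) ≤ f(n) ≤ c₁·g(n) for all n ≥ n₀

For example, f(n) = 2n + 3 is O(n) because 2n + 3 ≤ 5n for all n ≥ 1.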
for (i = 0; i < 500; i++)
statement block;
for i in range(0, 500):
statement block
for (i = 0; i < 500; i += 2)
statement block;
for i in range(0, 500, 2):
statement block
for (i = 1; i < 1000; i *= 2)
    statement block;

for (i = 1000; i >= 1; i /= 2)
    statement block;

i = 1
while i < 1000:
    statement block
    i *= 2

i = 1000
while i >= 1:
    statement block
    i //= 2
for (i = 0; i < 10; i++)          -- (n) times
    for (j = 1; j < 10; j *= 2)   -- (log n) times
        statement block;

for i in range(0, 10):            # (n) times
    j = 1
    while j < 10:                 # (log n) times
        statement block
        j *= 2
for (i = 0; i < 10; i++)          -- n times
    for (j = 0; j < 10; j++)      -- n times
        statement block;

for i in range(0, 10):            # n times
    for j in range(0, 10):        # n times
        statement block
for (i =0; i < 10; i++)
for (j = 0; j <= i; j++)
statement block;
for i in range(0, 10):
    for j in range(0, i + 1):
        statement block