Tuesday, January 12, 2021


Asymptotic Notation - 22 Weeks of DSA

In the previous post (click to visit), we already studied the best, average, and worst cases.

Order of Growth (it's very important):

c < log log n < log n < n^(1/3) < n^(1/2) < n < n^2 < n^3 < n^4 < 2^n < n^n
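To get a feel for this ordering, here is a small Python sketch (my own illustration, not part of the original post) that prints each growth function for a few values of n; any discrepancies at tiny n are expected, since the ordering is asymptotic:

import math

# Print each growth function for a few values of n to see the ordering
# c < log log n < log n < n^(1/3) < n^(1/2) < n < n^2 < n^3 < n^4 < 2^n < n^n.
growth_functions = [
    ("c (constant, c = 1)", lambda n: 1),
    ("log log n",           lambda n: math.log(math.log(n))),
    ("log n",               lambda n: math.log(n)),
    ("n^(1/3)",             lambda n: n ** (1 / 3)),
    ("n^(1/2)",             lambda n: n ** 0.5),
    ("n",                   lambda n: n),
    ("n^2",                 lambda n: n ** 2),
    ("n^3",                 lambda n: n ** 3),
    ("n^4",                 lambda n: n ** 4),
    ("2^n",                 lambda n: 2 ** n),
    ("n^n",                 lambda n: n ** n),
]

for n in (4, 16, 64):
    print(f"n = {n}")
    for name, func in growth_functions:
        print(f"  {name:20s} -> {func(n):.3g}")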

Now let's talk about Notations.

There are three types of Notations (mainly):

  • BIG O Notation
  • BIG OMEGA Notation (Ω) and
  • BIG THETA Notation (Θ)
Let's check each one by one...

Big O Notation:

Okay, so consider that you have two algorithms: one takes 10 iterations and the other takes 20. No issues there; both will have almost the same efficiency. But when it is 10 iterations versus 1000 iterations, then it becomes a matter of concern.
The Big O notation, where O stands for 'order of', is concerned with what happens for large values of n. For example, if a sorting algorithm performs n*n operations to sort just n elements, then the algorithm would be described as an O(n*n) algorithm.
When expressing complexity using Big O notation, constant multipliers are ignored. That means O(4n) is equivalent to O(n).

If f(n) and g(n) are functions defined on the positive integers n, then
f(n) = O(g(n))  --- read as 'f of n is Big O of g of n'
if and only if there exist positive constants c and n0 such that
f(n) ≤ c·g(n) for all n ≥ n0.
It means that for large inputs, f(n) grows no faster than a constant factor times g(n). Hence g(n) provides an upper bound.

Note that here c is a constant which depends on the following factors:
  • the programming language used,
  • the quality of the compiler or interpreter,
  • the programmer's knowledge,
  • the size of the main memory and the access time to it,
  • the CPU speed, and
  • the algorithm itself.
Point to remember: the Big O notation provides an upper bound for f(n). That means the function f(n) can do better but not worse than the specified value.

Mathematically speaking:
        If f(n) ≤ c·g(n) for some constant c > 0 and for all n ≥ n0, then f(n) = O(g(n)) and g(n) is an asymptotic upper bound for f(n).

Examples of functions in O(n^3) include n^2.9, n^3, n^3 + n, 540n^3 + 10 (and also smaller functions such as n^2, n^2 + n, and 540n + 10, since Big O only gives an upper bound).
Examples of functions not in O(n^3) include n^3.2 and 2^n.
Big O Notation: f(n) ≤ c g(n)
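As a rough numerical illustration of the definition above (a hedged sketch of my own, not part of the original post), the helper below checks f(n) ≤ c·g(n) for every n from a hand-picked n0 up to a finite limit; the constants c and n0 here are assumptions chosen for this example:

def looks_big_o(f, g, c, n0, limit=10_000):
    # Empirically check f(n) <= c * g(n) for all n in [n0, limit].
    # This is only a finite sanity check, not a proof: Big O is a claim
    # about every n >= n0.
    return all(f(n) <= c * g(n) for n in range(n0, limit + 1))

# Example: f(n) = 3n^2 + 10n + 7 is O(n^2), e.g. with c = 4 and n0 = 11.
f = lambda n: 3 * n**2 + 10 * n + 7
g = lambda n: n**2

print(looks_big_o(f, g, c=4, n0=11))   # True: 3n^2 + 10n + 7 <= 4n^2 for n >= 11
print(looks_big_o(f, g, c=3, n0=11))   # False: the constant 3 is too small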


Categories of Algorithms:

According to Big O, we have five different categories of algorithms (a quick sketch follows the list):
  • Constant time algorithm: O(1)
  • Linear time algorithm: O(n)
  • Logarithmic time algorithm: O(log n)
  • Polynomial-time algorithm: O(n^k), where k > 1
  • Exponential time algorithm: O(2^n)
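To make these categories concrete, here is a small hedged sketch (my own examples, not from the post) with one function per category; the running-time claims are noted in the comments:

def get_first(items):            # O(1): constant time, a single index operation
    return items[0]

def total(items):                # O(n): linear time, touches each element once
    s = 0
    for x in items:
        s += x
    return s

def binary_search(items, key):   # O(log n): halves the (sorted) search range each step
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == key:
            return mid
        if items[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def count_pairs(items):          # O(n^2): a polynomial-time example (k = 2)
    n = len(items)
    return sum(1 for i in range(n) for j in range(i + 1, n))

def all_subsets(items):          # O(2^n): exponential time, since 2^n subsets exist
    if not items:
        return [[]]
    rest = all_subsets(items[1:])
    return rest + [[items[0]] + s for s in rest]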

Limitations of Big O Notation: 

  • Many algorithms are simply too hard to analyze mathematically.
  • There may not be sufficient information to calculate the behavior of an algorithm in the average case.
  • Big O analysis only tells us how the algorithm grows with the size of the problem, not how efficient it is, as it does not consider the programming effort. 
  • It ignores important constants. 

Omega Notation (𝛀):

The Omega notation provides a lower bound for f(n). This means that the function can never do better than the specified value, but it may do worse.
Ω notation is simply written as f(n) ∈ Ω(g(n)), where n is the problem size and
Ω(g(n)) = {h(n): ∃ positive constants c and n0 such that 0 ≤ c·g(n) ≤ h(n), ∀ n ≥ n0}.
Hence, we can say that Ω(g(n)) comprises the set of all functions h(n) that are greater than or equal to c·g(n) for all values of n ≥ n0.

If c·g(n) ≤ f(n) for some constant c > 0 and for all n ≥ n0, then f(n) ∈ Ω(g(n)) and g(n) is an asymptotic lower bound for f(n).

Examples of functions in Ω(n^2) include: n^2, n^3 + n^2, n^3, n^2.9
Examples of functions not in Ω(n^3) include: n^2, n^2.9, n


Omega Notation: f(n) ≥ c g(n)
To summarize,
  • Best Case Ω describes a lower bound for all combinations of input.
  • Worst case Ω describes a lower bound for worst-case input combinations.
  • If we simply write Ω,  it means the same as best-case Ω.
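As a hedged illustration of the best-case versus worst-case Ω distinction above (my own example, not from the post): in plain linear search, the best case finds the key at index 0, so the best-case time is Ω(1), while the worst case scans all n elements, so the worst-case time is Ω(n).

def linear_search(items, key):
    # Best case  (key at index 0):        1 comparison  -> best-case time is Ω(1).
    # Worst case (key at the end/absent): n comparisons -> worst-case time is Ω(n).
    for i, x in enumerate(items):
        if x == key:
            return i
    return -1

print(linear_search([7, 3, 9, 4], 7))   # best case: found immediately
print(linear_search([7, 3, 9, 4], 5))   # worst case: scans all n elements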

Theta Notation (Ө):

Theta notation provides an asymptotically tight bound for f(n). Ө notation is simply written as f(n) ∈ Ө(g(n)), where n is the problem size and
Ө(g(n)) = {h(n): ∃ positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ h(n) ≤ c2·g(n), ∀ n ≥ n0}.
Hence, we say that Ө(g(n)) comprises a set of all the functions h(n) that are between c1g(n) and c2g(n) for all values of n ≥ n0.
Theta Notation: c1·g(n) ≤ f(n) ≤ c2·g(n)


To summarize,
  • The best case in Ө notation is not used.
  • Worst case Ө describes asymptotic bounds for the worst-case combination of input values.
  • If we simply write Ө, it means the same as worst-case Ө.
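Analogous to the Big O check earlier, here is a hedged sketch (my own, not from the post) that numerically tests the two-sided Theta condition c1·g(n) ≤ f(n) ≤ c2·g(n); the constants c1, c2, and n0 are hand-picked assumptions for this example:

def looks_theta(f, g, c1, c2, n0, limit=10_000):
    # Empirically check c1*g(n) <= f(n) <= c2*g(n) for all n in [n0, limit].
    # Again only a finite sanity check -- Theta is a claim about every n >= n0.
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, limit + 1))

# Example: f(n) = 3n^2 + 2n is Θ(n^2), e.g. with c1 = 3, c2 = 4, n0 = 2.
f = lambda n: 3 * n**2 + 2 * n
g = lambda n: n**2

print(looks_theta(f, g, c1=3, c2=4, n0=2))   # True: n^2 sandwiches f(n) from n = 2 on
print(looks_theta(f, g, c1=3, c2=4, n0=1))   # False: at n = 1, f(1) = 5 > 4 * g(1)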
In the upcoming post, we will be calculating the complexities of expressions...
With that,
Harsh Jaiswal
Signing off...
