Design and Analysis of Algorithms - Old Questions

1.  What do you mean by complexity of an algorithm? Explain about the asymptotic notations used to describe the time/space complexity of any algorithm with their geometrical interpretation and example.    (1+9)

10 marks | Asked in 2076 (new)

The complexity of an algorithm is a measure of the amount of time and/or space it requires for an input of a given size n. Complexity is of two kinds: time complexity and space complexity.

  • Time complexity measures the total time an algorithm takes to execute, expressed as a function of the input size.
  • Space complexity measures how much memory an algorithm needs during execution, again as a function of the input size (see the small counting sketch after this list).
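
For illustration only (this sketch is not part of the original answer and assumes a simple summation routine as the example algorithm), the following Python snippet counts basic operations to show how running time grows with n while the extra memory stays constant:

```python
# Minimal sketch: count the basic operations and extra memory of a
# hypothetical summation routine to contrast time and space complexity.

def sum_list(items):
    """Sums a list; the loop body runs once per element."""
    ops = 0        # counts basic operations (one addition per element)
    total = 0      # only a constant amount of extra storage is used
    for x in items:
        total += x
        ops += 1
    return total, ops

for n in (10, 100, 1000):
    _, ops = sum_list(list(range(n)))
    # ops grows linearly with n -> O(n) time; extra space stays constant -> O(1)
    print(n, ops)
```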

Asymptotic notations give a general representation of the time and space complexity of an algorithm. Three notations are used most often:

1. Big-Oh notation:

A function f(n) is said to be “big-Oh of g(n)”, and we write f(n)=O(g(n)) or simply f=O(g), if there exist two positive constants c and n0 such that f(n)<=c*g(n) for all n>=n0.

E.g. the big-Oh notation of f(n)=n+4 is O(n) since n+4<=2n for all n>=4.

The big-Oh notation of f(n)=n²+3n+4 is O(n²) since n²+3n+4<=2n² for all n>=4.


Big-Oh notation gives an upper bound on the growth of the running time, so it is most often used to describe the worst-case behaviour of an algorithm.
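
As a quick illustration (not part of the original answer), the witnesses used in the first example above, c=2 and n0=4, can be checked numerically in Python:

```python
# Check the big-Oh witness: f(n) = n + 4 <= 2*n for every n >= 4,
# i.e. c = 2 and n0 = 4, so f(n) = O(n).

def f(n):
    return n + 4

def g(n):
    return n

c, n0 = 2, 4
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
print("n + 4 <= 2n holds for all tested n >= 4, so f(n) = O(n)")
```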


2. Big-Omega notation:

A function f(n) is said to be “big-Omega of g(n)”, and we write f(n)=Ω(g(n)) or simply f=Ω(g), if there exist two positive constants c and n0 such that f(n)>=c*g(n) for all n>=n0.


E.g. the big-Omega notation of f(n)=5n+6 is Ω(n) since 5n+6>=5n for all n>=1.

The big-Omega notation of f(n)=3n²+2n+4 is Ω(n²) since 3n²+2n+4>=3n² for all n>=1.

 

Big-Omega notation gives a lower bound on the growth of the running time, so it is often used to describe the best-case behaviour of an algorithm.
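
A similar illustrative check (again, not part of the original answer) for the first big-Omega example above, with witnesses c=5 and n0=1:

```python
# Check the big-Omega witness: f(n) = 5n + 6 >= 5*n for every n >= 1,
# i.e. c = 5 and n0 = 1, so f(n) = Ω(n).

def f(n):
    return 5 * n + 6

def g(n):
    return n

c, n0 = 5, 1
assert all(f(n) >= c * g(n) for n in range(n0, 10_000))
print("5n + 6 >= 5n holds for all tested n >= 1, so f(n) = Ω(n)")
```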


3. Big Theta notation:

A function f(n) is said to be “big-Theta of g(n)”, and we write f(n)=Θ(g(n)) or simply f=Θ(g), if there exist positive constants c1, c2 and n0 such that c1*g(n)<=f(n)<=c2*g(n) for all n>=n0.

E.g. the big-Theta notation of f(n)=5n+2 is Θ(n) since 6n>=5n+2>=5n for all n>=3.

The big-Theta notation of f(n)=3n²+2n+1 is Θ(n²) since 4n²>=3n²+2n+1>=3n² for all n>=3.

Big-Theta notation describes both an upper and a lower bound on the running time, so it characterises the exact asymptotic behaviour of an algorithm.
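
Finally, an illustrative check (not part of the original answer) of the second big-Theta example above, with witnesses c1=3, c2=4 and n0=3:

```python
# Check the big-Theta witnesses: 3*n**2 <= 3n^2 + 2n + 1 <= 4*n**2
# for every n >= 3, i.e. c1 = 3, c2 = 4, n0 = 3, so f(n) = Θ(n^2).

def f(n):
    return 3 * n ** 2 + 2 * n + 1

def g(n):
    return n ** 2

c1, c2, n0 = 3, 4, 3
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("3n^2 <= 3n^2 + 2n + 1 <= 4n^2 holds for all tested n >= 3, so f(n) = Θ(n^2)")
```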