1. Order Complexity Analysis

Created Sunday 05 January 2020

The idea is to quantify the quality of a solution in terms of time (steps) and space (memory).

  1. Time - This is very important.

Two approaches:

  1. Experimental Analysis: For a given algo, the **simplest** method is to note the running time for input sizes 1, 10, 100, 1000, etc. Plot the graph and we can figure out an expression for the time for a given n, say f(n) = an + b, or some other function.

**Confidence Boost:** Ankush has done the same thing I did in semester IV. He used time(0). Yeah!!!

Comparing time values for 0.1 million elements: merge sort takes ~0.15 seconds, while selection sort takes ~2400 seconds (about 40 minutes). Investment in algos is worth it.
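A minimal sketch of this experimental approach in Python (the single-run timing and the use of the built-in sorted as the algorithm under test are my own choices, just to show the method):

```python
import random
import time

def running_time(func, n):
    """Time func once on a random input of size n (rough, single-run measurement)."""
    data = [random.random() for _ in range(n)]
    start = time.perf_counter()
    func(data)
    return time.perf_counter() - start

# Plot these (n, t) pairs to guess an expression like f(n) = a*n + b.
for n in (1, 10, 100, 1000, 10_000):
    t = running_time(sorted, n)  # built-in sort as a stand-in algorithm
    print(f"n = {n:>6}: t = {t:.6f} s")
```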

Hence, we won’t be doing this.

  2. Theoretical analysis **(this is a gold mine)**:

In this, we don't need to take the computer, resources, etc. into account. Our pseudocode/algorithm is enough to find the time complexity.

This makes sense: since we come up with the solution ourselves, why use experimentation to calculate time? This is one important thing we can do, for algorithms at least.

Important Note: Suppose some algorithm takes 5n^2^. Is this accurate in seconds? No, but it is the number of steps (i.e. the number of unit operations) that we will be doing, so machine dependency can be ignored altogether. **Confidence Boost:** I thought the exact same; this is the best feeling.
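To make "steps, not seconds" concrete, here is a tiny Python sketch (the count_ops function and its loop body are hypothetical, purely for illustration): the operation count depends only on n, not on the machine.

```python
def count_ops(n):
    """Count unit operations for a hypothetical double loop over n elements."""
    ops = 0
    total = 0
    for i in range(n):
        for j in range(n):
            total += i * j  # one multiply + one add...
            ops += 2        # ...counted as two unit operations
    return ops

# The count is the same on any machine: it grows like 2*n^2.
for n in (10, 100, 1000):
    print(n, count_ops(n))
```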

For a function f(n), Big Oh is a way to give an upper bound (which may or may not be the worst case): g(n) is O(f(n)) if g(n) <= k * f(n) for all large n and some positive constant k. To find the Big Oh of an algorithm, we:

  1. Count the number of unit operations.
  2. Keep only the highest power term.
  3. Ignore the coefficients.
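For example, g(n) = 3n^2^ + 5n + 7 is O(n^2^): for all n >= 1, 5n <= 5n^2^ and 7 <= 7n^2^, so g(n) <= 15n^2^, and k = 15 works.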

Q) Find the largest element in an array. A) Can be done in a single scan. We do k (a constant) operations for each number, so k*n in total. This is Theta(n); hence it is also O(n).
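A single-scan sketch in Python (Python's built-in max does the same job; this is just to show the k operations per element):

```python
def find_largest(arr):
    """Single scan: a constant number of operations per element, hence Theta(n)."""
    largest = arr[0]      # initialization: constant time
    for x in arr[1:]:
        if x > largest:   # one comparison per element
            largest = x
    return largest

print(find_largest([3, 41, 7, 26, 9]))  # 41
```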

Things like initialization are constant time, so they are ignored.

Q) Bubble sort for input size n. A) Suppose we swap every time. The total work is the sum over i = 1 to n of (n - i) = n(n-1)/2, which is O(n^2^).
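A plain bubble sort sketch in Python (the no-early-exit version, matching the every-swap assumption above):

```python
def bubble_sort(arr):
    """Plain bubble sort: pass i performs n - 1 - i comparisons."""
    n = len(arr)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]  # swap (worst case: every time)
    return arr

# Total comparisons: (n-1) + (n-2) + ... + 1 = n(n-1)/2, hence O(n^2).
print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```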