Time Complexity Notes
Introduction
An important question while programming is: how efficient is an algorithm or a
piece of code?
Efficiency covers a number of resources, most importantly running time and memory (space).
Algorithm Analysis
Algorithm analysis is an important part of computational complexity theory, which
provides theoretical estimates of the resources an algorithm requires to solve
a specific computational problem. Analysis of an algorithm is the determination of the
amount of time and space it requires to execute.
Types of Analysis
To analyze a given algorithm, we need to know for which inputs the algorithm
takes less time (i.e. the algorithm performs well) and for which inputs it
takes a long time.
Three types of analysis are generally performed:
• Worst-Case Analysis: The worst case consists of the input for which the
algorithm takes the longest time to complete its execution.
• Best-Case Analysis: The best case consists of the input for which the algorithm
takes the least time to complete its execution.
• Average-Case Analysis: The average case gives an idea of the running time of
the algorithm averaged over all inputs.
Big-O notation
We can express algorithmic complexity using the big-O notation. For a problem of
size N:
Definition: Let g and f be functions from the set of natural numbers to itself. The
function f is said to be O(g) (read "big-oh of g") if there exist a constant c and a
natural number n0 such that f(n) ≤ c*g(n) for all n > n0.
Note: O(g) is a set!
Abuse of notation: we commonly write f = O(g), but what this really means is f ∈ O(g).
Example: Although we can include constants within the big-O notation, there is no
reason to do so. Thus, we can write O(5n + 4) = O(n).
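As a quick sanity check of the definition (the witnesses c = 6 and n0 = 4 below are our own illustrative choices, not from the notes), we can verify numerically that f(n) = 5n + 4 is O(n):

```python
# Check that f(n) = 5n + 4 is O(n): find a constant c and a threshold n0
# with f(n) <= c * g(n) for all n > n0. Here g(n) = n, c = 6, n0 = 4,
# since 5n + 4 <= 6n is equivalent to 4 <= n.
def f(n):
    return 5 * n + 4

def g(n):
    return n

c, n0 = 6, 4
assert all(f(n) <= c * g(n) for n in range(n0 + 1, 10_000))
```

Any larger c would also work; the definition only asks that *some* pair (c, n0) exists.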
Note: Big-O expressions do not contain constants or low-order terms. This is
because, when N gets large enough, constants and low-order terms don't matter (a
constant-time function/method will be faster than a linear-time function/method,
which will be faster than a quadratic-time function/method).
1. Sequence of statements
statement 1;
statement 2;
...
statement k;
The total time is found by adding the times for all statements:
total time = time(statement 1) + time(statement 2) + ... + time(statement k)
2. if-else statements
if (condition):
#sequence of statements 1
else:
#sequence of statements 2
For example, if sequence 1 is O(N) and sequence 2 is O(1), the worst-case time for
the whole if-else statement would be O(N).
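To make this concrete, here is a minimal sketch (the function and its arguments are our own illustration, not from the notes) where one branch is O(N) and the other O(1):

```python
def process(items, condition):
    # Branch 1 is O(N): it touches every one of the N elements.
    if condition:
        total = 0
        for x in items:
            total += x
        return total
    # Branch 2 is O(1): constant work regardless of N.
    return len(items)

# Worst case for the whole if-else = max(O(N), O(1)) = O(N).
```

The worst-case bound is taken over both branches, since at analysis time we don't know which branch will run.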
3. for loops
for i in range(N):
#sequence of statements
Here, the loop executes N times, so the sequence of statements also executes N
times. Now, assuming all the statements are of the order of O(1), the total
time for the for loop is N * O(1), which is O(N) overall.
4. Nested loops
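The notes leave this section without a worked example; a minimal sketch (the function is our own illustration) of two nested loops of N iterations each:

```python
def nested_loop_count(N):
    # The O(1) body runs once per (i, j) pair: N * N = N^2 times,
    # so the nested loop as a whole is O(N^2).
    count = 0
    for i in range(N):
        for j in range(N):
            count += 1
    return count
```

In general, the total time of a nested loop is the product of the sizes of all the loops times the cost of the body; for instance, `nested_loop_count(10)` performs 100 body executions.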
Sample Problem:
What will be the time complexity of the following while loop in terms of N?
while N > 0:
    N = N // 8
Suppose the loop runs k times. After k iterations the value is N/8^k, and the loop
terminates roughly when this reaches 1:
N/8^k = 1
=> N = 8^k
=> log(N) = log(8^k)
=> k * log(8) = log(N)
=> k = log(N)/log(8)
=> k = log8(N)
Now, clearly the number of iterations in this example comes out to be of the
order of log8(N). Thus, the time complexity of the above while loop is
O(log8(N)).
Qualitatively, we can say that every iteration divides the given number by 8,
and we keep dividing like that as long as the number remains greater than 0. This gives
the number of iterations as O(log8(N)).
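We can confirm the iteration count empirically (the counting helper below is our own sketch, not from the notes):

```python
def iterations(N):
    # Count how many times the loop body from the sample problem runs
    # before N reaches 0.
    k = 0
    while N > 0:
        N = N // 8
        k += 1
    return k

# For N = 8**5, five divisions bring the value down to 1 and one more
# brings it to 0, so there are log8(N) + 1 = 6 passes -- still O(log8(N)).
```

Note the exact count is floor(log8(N)) + 1 because the loop runs until N hits 0, not 1; the extra pass is absorbed by the big-O.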
Linear Search
Worst Case - In the worst possible case:
● The element being searched for may be present at the last position, or may not
be present in the array at all.
● In the former case, the search terminates in success with N comparisons.
● In the latter case, the search terminates in failure with N comparisons.
● Thus, in the worst case, the linear search algorithm takes O(N) operations.
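A standard linear search implementation matching the analysis above (a sketch of the usual algorithm; the notes themselves do not include code here):

```python
def linear_search(arr, target):
    # Scan left to right. In the worst case we inspect all N elements:
    # either target sits at the last index (success after N comparisons)
    # or it is absent (failure after N comparisons) -> O(N).
    for i, value in enumerate(arr):
        if value == target:
            return i      # index of the first match
    return -1             # not found
```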
Binary Search
Binary search time complexity analysis is done below-
● In each iteration or each recursive call, the search gets reduced to half of the
array.
● So for N elements in the array, there are log2(N) iterations or recursive calls.
Thus, the time complexity of binary search is O(log2(N)).
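The halving argument corresponds to the usual iterative binary search over a sorted array (a standard sketch; the notes do not include code here):

```python
def binary_search(arr, target):
    # Each pass halves the search range [lo, hi], so a sorted array of
    # N elements needs about log2(N) passes -> O(log2(N)).
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1    # target can only be in the right half
        else:
            hi = mid - 1    # target can only be in the left half
    return -1               # not found
```

Note this requires the input array to already be sorted; on unsorted data only the O(N) linear search applies.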