DAA (CS14103) : Department of Computer Science and Engineering


DAA (CS14103)

Introduction

Department of Computer Science and Engineering


Algorithm
● An algorithm is a sequence of computational steps that transform the input
into the output.
● An Algorithm is a finite set of instructions that accomplishes a particular
task.
Properties of an Algorithm

1. Input. Zero or more quantities are externally supplied.
2. Output. At least one quantity is produced.
3. Definiteness. Each instruction is clear and unambiguous.
4. Finiteness. The algorithm terminates after a finite number of steps.
5. Effectiveness. Every instruction must be basic enough that it can be carried
out by a person using only pencil and paper.
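As a concrete illustration (not from the slides), Euclid's GCD algorithm satisfies all five properties: it takes two inputs, produces one output, each step is unambiguous, it always terminates, and every operation is basic.

```python
def gcd(a: int, b: int) -> int:
    """Greatest common divisor of two non-negative integers (Euclid)."""
    while b != 0:        # Finiteness: b strictly decreases, so the loop ends
        a, b = b, a % b  # Definiteness: each step is one clear operation
    return a             # Output: at least one quantity is produced

print(gcd(48, 18))  # 6
```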
Algorithm Analysis

The analysis of an algorithm depends on the time and memory taken by the algorithm.

● Space complexity: The amount of memory/space required by the algorithm to
run to completion.
● Time complexity: The amount of computer time required by the algorithm to
run to completion.

Time complexity is usually given more weight than space complexity.


Algorithm Analysis
Types of Analysis
1. Worst Case (usually used)
2. Average Case (sometimes used)
3. Best Case (not reliable)

Challenge: How to make it machine independent?

The big idea: analyze the growth rate of running time (asymptotic analysis).


Growth of Functions

● Suppose ‘M’ is an algorithm, and suppose ‘n’ is the size of the input data.
Clearly the complexity f(n) of M increases as n increases.
● It is usually the rate of increase of f(n) we want to examine. This is usually
done by comparing f(n) with some standard functions.
● The most common computing times are:

O(1), O(log₂ n), O(n), O(n log₂ n), O(n²), O(n³), O(2ⁿ), O(n!) and O(nⁿ)
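A quick sketch (not part of the slides) that tabulates these common computing times for a few input sizes, to make the differences in growth rate concrete:

```python
import math

# Each entry pairs a label with the corresponding growth function.
growth = [
    ("1",        lambda n: 1),
    ("log2 n",   lambda n: math.log2(n)),
    ("n",        lambda n: n),
    ("n log2 n", lambda n: n * math.log2(n)),
    ("n^2",      lambda n: n ** 2),
    ("n^3",      lambda n: n ** 3),
    ("2^n",      lambda n: 2 ** n),
]

for label, f in growth:
    # Evaluate each function at n = 2, 8, 16 and round for readability.
    print(f"{label:9s}", [round(f(n)) for n in (2, 8, 16)])
```

Even at n = 16, the exponential 2ⁿ already dwarfs every polynomial entry, which is why growth rate, not the exact step count, is what asymptotic analysis examines.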


Asymptotic Notation

● Asymptotic notations are the mathematical notations used to describe the
running time of an algorithm as the input tends towards a particular or
limiting value.
● The efficiency of an algorithm depends on the amount of time, storage and
other resources required to execute the algorithm. The efficiency is measured
with the help of asymptotic notations.
● Asymptotic analysis is the study of how the performance of an algorithm
changes as the order of the input size changes.
Types of Asymptotic Notations
We use the following five asymptotic notations to represent the growth of any
algorithm as the input increases:
1. Big Theta (Θ)
2. Big Oh(O)
3. Big Omega (Ω)
4. Little Oh (o)
5. Little Omega (ω)
Big Oh (O)
Definition: The function f(n) = O(g(n)) (read as “f of n is big oh of g of n”) iff
there exist positive constants c and n0 such that
f(n) ≤ c·g(n) for all n ≥ n0.
g(n) is then an upper bound on f(n).
Big Oh (O): Examples
1. 3n + 2 = O(n), since
3n + 2 ≤ 4n for all n ≥ 2

2. 2n³ + 3n² + n = O(n³), since
2n³ + 3n² + n ≤ 3n³ for all n ≥ 4
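The two Big Oh examples above can be spot-checked numerically. This is a sketch, not a proof: it only verifies that the chosen constants c and n0 make f(n) ≤ c·g(n) hold over a sampled range of n.

```python
def holds_big_oh(f, g, c, n0, upto=1000):
    """Check f(n) <= c*g(n) for every sampled n in [n0, upto)."""
    return all(f(n) <= c * g(n) for n in range(n0, upto))

# Example 1: 3n + 2 <= 4n for all n >= 2
assert holds_big_oh(lambda n: 3*n + 2, lambda n: n, c=4, n0=2)

# Example 2: 2n^3 + 3n^2 + n <= 3n^3 for all n >= 4
assert holds_big_oh(lambda n: 2*n**3 + 3*n**2 + n, lambda n: n**3, c=3, n0=4)

print("both Big Oh bounds hold on the sampled range")
```

Note that Example 2 genuinely needs n0 = 4: at n = 3, f(3) = 84 exceeds 3·3³ = 81.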
Big Omega (Ω)
Definition: The function f(n) = Ω(g(n)) (read as “f of n is Omega of g of n”) iff
there exist positive constants c and n0 such that
f(n) ≥ c·g(n) for all n ≥ n0.
g(n) is then a lower bound on f(n).
Big Omega (Ω): Examples
1. 3n + 2 = Ω(n), since
3n + 2 ≥ 3n for all n ≥ 1

2. 2n³ + 3n² + n = Ω(n³), since
2n³ + 3n² + n ≥ 2n³ for all n ≥ 1
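The same numerical spot-check works for the Big Omega examples, with the inequality reversed (again a sketch over a sampled range, not a proof):

```python
def holds_big_omega(f, g, c, n0, upto=1000):
    """Check f(n) >= c*g(n) for every sampled n in [n0, upto)."""
    return all(f(n) >= c * g(n) for n in range(n0, upto))

# Example 1: 3n + 2 >= 3n for all n >= 1
assert holds_big_omega(lambda n: 3*n + 2, lambda n: n, c=3, n0=1)

# Example 2: 2n^3 + 3n^2 + n >= 2n^3 for all n >= 1
assert holds_big_omega(lambda n: 2*n**3 + 3*n**2 + n, lambda n: n**3, c=2, n0=1)

print("both Big Omega bounds hold on the sampled range")
```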
Big Theta (Θ)
Definition: The function f(n) = Θ(g(n)) (read as “f of n is theta of g of n”) iff
there exist positive constants c1, c2 and n0 such that
c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
Big Theta (Θ): Examples
1. 3n + 2 = Θ(n), since
3n + 2 ≥ 3n for all n ≥ 2
3n + 2 ≤ 4n for all n ≥ 2
Here c1 = 3, c2 = 4 and n0 = 2.

2. 2n³ + 3n² + n = Θ(n³), since
2n³ + 3n² + n ≥ 2n³ for all n ≥ 4
2n³ + 3n² + n ≤ 3n³ for all n ≥ 4
Here c1 = 2, c2 = 3 and n0 = 4.
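Since Θ combines both bounds, the check for the examples above sandwiches f(n) between c1·g(n) and c2·g(n) (a sampled-range sketch, not a proof):

```python
def holds_big_theta(f, g, c1, c2, n0, upto=1000):
    """Check c1*g(n) <= f(n) <= c2*g(n) for every sampled n in [n0, upto)."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, upto))

# Example 1: 3n <= 3n + 2 <= 4n with c1=3, c2=4, n0=2
assert holds_big_theta(lambda n: 3*n + 2, lambda n: n, c1=3, c2=4, n0=2)

# Example 2: 2n^3 <= 2n^3 + 3n^2 + n <= 3n^3 with c1=2, c2=3, n0=4
assert holds_big_theta(lambda n: 2*n**3 + 3*n**2 + n, lambda n: n**3,
                       c1=2, c2=3, n0=4)

print("both Big Theta sandwiches hold on the sampled range")
```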
Little Oh (o)
Definition: The function f(n) = o(g(n)) (read as “f of n is little oh of g of n”) iff
for every positive constant c (c > 0) there exists an n0 > 0 such that
f(n) < c·g(n) for all n ≥ n0.

● It is an asymptotic notation denoting an upper bound that is not
asymptotically tight on the growth rate of the running time of an algorithm.
● In terms of limits, f(n) = o(g(n)) means
lim (n→∞) f(n)/g(n) = 0.
Little Oh (o): Examples
1. If f(n) = n² and g(n) = n³, check whether f(n) = o(g(n)) or not.
lim (n→∞) n²/n³ = lim (n→∞) 1/n = 0.
The limit is 0, which satisfies the condition above, so f(n) = o(g(n)).
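A small numerical sketch (not from the slides) of this example: the ratio n²/n³ = 1/n shrinks towards 0 as n grows, which is exactly the limit condition for little oh.

```python
# Ratio f(n)/g(n) for f(n) = n^2, g(n) = n^3 at increasing n.
for n in (10, 100, 1000, 10**6):
    print(n, (n ** 2) / (n ** 3))  # equals 1/n, tending to 0
```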
Little Omega (ω)
Definition: The function f(n) = ω(g(n)) (read as “f of n is little omega of g of n”)
iff for every positive constant c (c > 0) there exists an n0 > 0 such that
f(n) > c·g(n) for all n ≥ n0.

● It is an asymptotic notation denoting a lower bound that is not
asymptotically tight on the growth rate of the running time of an algorithm.
● In terms of limits, f(n) = ω(g(n)) means
lim (n→∞) f(n)/g(n) = ∞.
Little Omega (ω): Examples
1. For example, n²/2 = ω(n), but n²/2 ≠ ω(n²). The relation f(n) = ω(g(n))
implies that
lim (n→∞) f(n)/g(n) = ∞.

That is, f(n) becomes arbitrarily large relative to g(n) as n approaches infinity.
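A numerical sketch (not from the slides) of both claims in this example: against g(n) = n the ratio is n/2 and grows without bound, so n²/2 = ω(n); against g(n) = n² the ratio is the constant 1/2, so n²/2 is not ω(n²).

```python
# Ratios of f(n) = n^2/2 against g(n) = n and g(n) = n^2.
for n in (10, 100, 1000):
    f = n ** 2 / 2
    print(n, f / n, f / n ** 2)  # first grows as n/2; second stays at 0.5
```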
