CS161: Design and Analysis of Algorithms
Lecture 2
Leonidas Guibas
Outline
Review of last lecture
Asymptotic notations
Big O, big Ω, Θ, etc.
Correctness of Algorithms
Efficiency of Algorithms
Worst case
Provides an upper bound on running time
An absolute guarantee
Best case – not very useful
Average case
Provides the expected running time
Very useful, but treat with care: what is “average”?
Random (equally likely) inputs
Real-life inputs
Analysis of Insertion Sort
InsertionSort(A, n) {
    for j = 2 to n {
        key = A[j]
        i = j - 1
        while (i > 0) and (A[i] > key) {
            A[i+1] = A[i]
            i = i - 1
        }
        A[i+1] = key    // How many times will this line execute?
    }
}
Analysis of Insertion Sort
Statement                                   cost   times
InsertionSort(A, n) {
    for j = 2 to n {                        c1     n
        key = A[j]                          c2     n-1
        i = j - 1                           c3     n-1
        while (i > 0) and (A[i] > key) {    c4     S
            A[i+1] = A[i]                   c5     S-(n-1)
            i = i - 1                       c6     S-(n-1)
        }                                          0
        A[i+1] = key                        c7     n-1
    }                                              0
}
S = t2 + t3 + … + tn, where tj is the number of while-condition evaluations in the jth iteration of the for loop
Analyzing Insertion Sort
T(n) = c1·n + c2(n-1) + c3(n-1) + c4·S + c5(S-(n-1)) + c6(S-(n-1)) + c7(n-1)
     = c8·S + c9·n + c10
What can S be?
Best case: inner loop body never executed
    tj = 1, so S = n - 1
    T(n) = an + b, a linear function
Worst case: inner loop body executed for all previous elements
    tj = j, so S = 2 + 3 + … + n = n(n+1)/2 - 1
    T(n) = an² + bn + c, a quadratic function
Average case: on average, we have to insert A[j] into the middle of A[1..j-1], so tj ≈ j/2
    S ≈ n(n+1)/4
    T(n) is still a quadratic function
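The best- and worst-case values of S can be checked with a short runnable sketch (Python; the counter is my own instrumentation, not part of the slide's pseudocode):

```python
def insertion_sort_count(A):
    """Sort A in place and return S, the number of while-condition evaluations."""
    S = 0
    for j in range(1, len(A)):            # j = 2..n in the 1-based pseudocode
        key = A[j]
        i = j - 1
        S += 1                            # the condition is evaluated at least once
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]
            i -= 1
            S += 1                        # ...and once more after every shift
        A[i + 1] = key
    return S

print(insertion_sort_count(list(range(8))))         # best case (sorted): S = n - 1 = 7
print(insertion_sort_count(list(range(8, 0, -1))))  # worst case (reversed): S = n(n+1)/2 - 1 = 35
```

For n = 8 the two runs report 7 and 35, matching S = n − 1 and S = n(n+1)/2 − 1 above.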
Asymptotic Analysis
Abstract away statement costs (don't care about c1, c2, etc.)
Order of growth (as a function of n, the input size) is the interesting measure:
Highest-order term is what counts
As the input size grows larger, it is the highest-order term that dominates
[Figure: plot of T(n) for 100·n and n² over 0 ≤ n ≤ 200 (y-axis ×10⁴); n² overtakes 100·n beyond n = 100]
Comparison of functions
n      log₂n   n      n·log₂n   n²      n³      2ⁿ      n!
10     3.3     10     33        10²     10³     10³     10⁶
10²    6.6     10²    660       10⁴     10⁶     10³⁰    10¹⁵⁸
10³    10      10³    10⁴       10⁶     10⁹     —       —
10⁴    13      10⁴    10⁵       10⁸     10¹²    —       —
10⁵    17      10⁵    10⁶       10¹⁰    10¹⁵    —       —
10⁶    20      10⁶    10⁷       10¹²    10¹⁸    —       —
Asymptotic Notations
Asymptotic Notations
O: Big-Oh
Ω: Big-Omega
Θ: Theta
o: Small-oh
ω: Small-omega
Big “O”
Informally, O(g(n)) is the set of all
functions with a smaller or same order of
growth as g(n), within a constant multiple
If we say f(n) is in O(g(n)), it means that
g(n) is an asymptotic upper bound on f(n)
Formally:
∃ C > 0 and n₀ such that f(n) ≤ C·g(n) for all n ≥ n₀
What is O(n²)?
The set of all functions that grow more slowly than, or at the same order as, n²
Small “o”
f(n)/g(n) → 0 as n → ∞
Big “Ω” [Omega]
Theta (“Θ”): Θ = O and Ω
So:
    n² ∈ Θ(n²)
    n² + n ∈ Θ(n²)                 Intuitively, Θ is like =
    100n² + n ∈ Θ(n²)
    100n² + log₂n ∈ Θ(n²)
But:
    n·log₂n ∉ Θ(n²)
    1000n ∉ Θ(n²)
    (1/1000)·n³ ∉ Θ(n²)
Tricky Cases
Big “O”, Formally
Definition: O(g(n)) = {f(n): ∃ ("there exist") positive constants C and n₀ such that 0 ≤ f(n) ≤ C·g(n) ∀ ("for all") n ≥ n₀}
Big “O”, Example
Claim: f(n) = 2n² + n = O(n²); e.g. C = 3 and n₀ = 1 work, since 2n² + n ≤ 3n² for all n ≥ 1.
Big “Ω”, Formally
Definition:
Ω(g(n)) = {f(n): ∃ positive constants C and n₀ such that 0 ≤ C·g(n) ≤ f(n) ∀ n > n₀}
Equivalently (if the limit exists): lim_{n→∞} f(n)/g(n) > 0
Abuse of notation (for convenience):
f(n) = Ω(g(n)) actually means f(n) ∈ Ω(g(n))
Big “Ω”, Example
Claim: f(n) = n²/10 = Ω(n)
Proof: C = 1/10 and n₀ = 1 work, since n²/10 ≥ n/10 for all n ≥ 1.
Alternatively:
lim_{n→∞} f(n)/g(n) = lim_{n→∞} (n/10) = ∞
Big “Θ”, Formally
Definition:
Θ(g(n)) = {f(n): ∃ positive constants c₁, c₂, and n₀ such that 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) ∀ n ≥ n₀}
Equivalently (if the limit exists): lim_{n→∞} f(n)/g(n) = c, with 0 < c < ∞
Equivalently: f(n) = O(g(n)) and f(n) = Ω(g(n))
Abuse of notation (for convenience):
f(n) = Θ(g(n)) actually means f(n) ∈ Θ(g(n))
Θ(1) means constant time.
Big “Θ”, Example
Claim: f(n) = 2n² + n = Θ(n²)
Alternatively: lim_{n→∞} (2n² + n)/n² = 2, a positive finite constant
More Examples
Prove n² + 3n + lg n is in O(n²)
Want to find C and n₀ such that n² + 3n + lg n ≤ C·n² for all n > n₀
Proof: for n > 1,
    n² + 3n + lg n ≤ 3n² + 3n + 3·lg n
                   ≤ 3n² + 3n² + 3n²
                   = 9n²
Or: for n > 10,
    n² + 3n + lg n ≤ n² + n² + n² = 3n²
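The constants found in the proof can be spot-checked numerically (Python; a sanity check over a finite range, not a proof):

```python
import math

def f(n):
    return n**2 + 3*n + math.log2(n)

# Witnesses from the proof above: C = 9 works for n > 1, and C = 3 for n > 10.
assert all(f(n) <= 9 * n**2 for n in range(2, 10_000))
assert all(f(n) <= 3 * n**2 for n in range(11, 10_000))
print("both bounds hold on the sampled range")
```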
More Examples
Prove n² + 3n + lg n is in Ω(n²)
Want to find C and n₀ such that n² + 3n + lg n ≥ C·n² for all n > n₀
Proof: C = 1 and n₀ = 1 work, since 3n + lg n ≥ 0 for n ≥ 1, so n² + 3n + lg n ≥ n².
O, Ω, and Θ
Using Limits to Compare Orders of Growth

lim_{n→∞} f(n)/g(n) =
    0        ⇒ f(n) ∈ o(g(n)), hence f(n) ∈ O(g(n))
    c > 0    ⇒ f(n) ∈ Θ(g(n))
    ∞        ⇒ f(n) ∈ ω(g(n)), hence f(n) ∈ Ω(g(n))
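The limit test can be probed numerically by evaluating f(n)/g(n) at a large n (Python; the `ratio` helper is mine, and a single sample is only suggestive, not a proof):

```python
import math

def ratio(f, g, n=10**6):
    """Sample f(n)/g(n) at one large n as a crude stand-in for the limit."""
    return f(n) / g(n)

print(ratio(lambda n: 100 * n, lambda n: n * n))           # tiny: suggests 100n ∈ o(n²)
print(ratio(lambda n: 2 * n * n + n, lambda n: n * n))     # ≈ 2: suggests 2n² + n ∈ Θ(n²)
print(ratio(lambda n: n * n, lambda n: n * math.log2(n)))  # huge: suggests n² ∈ ω(n log n)
```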
Logarithms
Exponentials
Compare 2ⁿ and 3ⁿ:
    lim_{n→∞} 2ⁿ/3ⁿ = lim_{n→∞} (2/3)ⁿ = 0
Therefore, 2ⁿ ∈ o(3ⁿ), and 3ⁿ ∈ ω(2ⁿ)
L'Hôpital's Rule
Condition: if lim f(n) and lim g(n) are both ∞ or both 0, then
    lim_{n→∞} f(n)/g(n) = lim_{n→∞} f′(n)/g′(n)
Example: lim_{n→∞} n^0.5 / ln n = lim_{n→∞} (½·n^−0.5)/(1/n) = lim_{n→∞} ½·n^0.5 = ∞
Stirling’s Formula (Useful)
n! ≈ √(2πn)·(n/e)ⁿ = √(2π)·n^(n+1/2)·e^(−n)
So n! ≈ (constant)·n^(n+1/2)·e^(−n)
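A quick numeric look at how tight the approximation is (Python):

```python
import math

def stirling(n):
    """Stirling's approximation: sqrt(2*pi*n) * (n/e)^n."""
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (5, 10, 20, 50):
    print(n, math.factorial(n) / stirling(n))  # ratio approaches 1 as n grows
```

The ratio is already about 1.017 at n = 5 and shrinks toward 1 (roughly as 1 + 1/(12n)).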
Compare 2ⁿ and n!:
    lim_{n→∞} n!/2ⁿ = lim_{n→∞} c·n^(n+1/2)·e^(−n)/2ⁿ = lim_{n→∞} c·√n·(n/(2e))ⁿ = ∞
Therefore, 2ⁿ = o(n!)

Compare nⁿ and n!:
    lim_{n→∞} n!/nⁿ = lim_{n→∞} c·n^(n+1/2)·e^(−n)/nⁿ = lim_{n→∞} c·√n/eⁿ = 0
Therefore, nⁿ = ω(n!)

Growth of log(n!), using Stirling's formula:
    log(n!) ≈ C + n·log n + (1/2)·log n − n
            = C + (n/2)·log n + ((n/2)·log n − n) + (1/2)·log n
            = Θ(n log n)
More Advanced Dominance Rankings
Asymptotic Notation Summary
O: Big-Oh
Ω: Big-Omega
Θ: Theta
o: Small-oh
ω: Small-omega
Intuitively:
O is like ≤ Ω is like ≥ Θ is like =
o is like < ω is like >
Properties of Asymptotic Notations
Recursive Algorithms
General idea:
Divide a large problem into smaller ones
By a constant ratio
By a constant or some variable
Solve each smaller one recursively or
explicitly
Combine the solutions of smaller ones to
form a solution for the original problem
MERGE-SORT A[1 . . n]
1. If n = 1, done.
2. Recursively sort A[1 . . ⌈n/2⌉] and A[⌈n/2⌉+1 . . n].
3. “Merge” the 2 sorted lists.
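The three steps can be sketched as runnable code (Python; `merge` and `merge_sort` are my names for the slide's steps):

```python
def merge(left, right):
    """Combine two sorted lists into one sorted list."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])    # one side is exhausted; append the rest of the other
    out.extend(right[j:])
    return out

def merge_sort(A):
    if len(A) <= 1:                # base case: already sorted
        return A
    mid = (len(A) + 1) // 2        # ceil(n/2), as in the pseudocode
    return merge(merge_sort(A[:mid]), merge_sort(A[mid:]))

print(merge_sort([20, 13, 7, 2, 12, 11, 9, 1]))  # → [1, 2, 7, 9, 11, 12, 13, 20]
```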
Merging Two Sorted Arrays

Example: merge subarray 1 = (2, 7, 13, 20) with subarray 2 = (1, 9, 11, 12).
Repeatedly compare the smallest remaining element of each subarray and output the smaller one:

    1 (from subarray 2), 2 (from subarray 1), 7 (from subarray 1),
    9 (from subarray 2), 11 (from subarray 2), 12 (from subarray 2), …

giving the merged output 1, 2, 7, 9, 11, 12, …
Each comparison emits one element, so merging takes time linear in the total number of elements.
How to Show the Correctness of a Recursive Algorithm?
By induction:
Base case: prove it works for small examples
Inductive hypothesis: assume the solution is
correct for all sub-problems
Step: show that, if the inductive hypothesis is
correct, then the algorithm is correct for the
original problem.
Correctness of MergeSort
MERGE-SORT A[1 . . n]
1. If n = 1, done.
2. Recursively sort A[1 . . ⌈n/2⌉] and A[⌈n/2⌉+1 . . n].
3. “Merge” the 2 sorted lists.

Proof:
1. Base case: if n = 1, the algorithm returns the correct answer because A[1..1] is already sorted.
2. Inductive hypothesis: assume that the algorithm correctly sorts A[1..⌈n/2⌉] and A[⌈n/2⌉+1..n].
3. Step: if A[1..⌈n/2⌉] and A[⌈n/2⌉+1..n] are both correctly sorted, then the whole array A[1..n] is sorted after merging them.
How to Analyze the Time-Efficiency of a Recursive Algorithm?
Analyzing MergeSort
1. Divide: Trivial.
2. Conquer: Recursively sort 2 subarrays.
3. Combine: Merge two sorted subarrays.

T(n) = 2·T(n/2) + f(n) + Θ(1)
(2 subproblems of size n/2 each; f(n) covers dividing and combining)

1. What is the time for the base case? Constant.
2. What is f(n)?
3. What is the growth order of T(n)?
Recurrence for MergeSort
T(n) = Θ(1)              if n = 1;
T(n) = 2T(n/2) + Θ(n)    if n > 1.
• Later we shall often omit stating the base
case when T(n) = Θ(1) for sufficiently
small n, but only when it has no effect on
the asymptotic solution to the recurrence.
Binary Search
To find an element in a sorted array, we
1. Check the middle element
2. If it equals the target, we’ve found it
3. Else, if the middle element is less than the target, search the right half
4. Else, search the left half
Example: Find 9
3 5 7 8 9 12 15
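The four steps above as a runnable sketch (Python; iterative rather than recursive, which changes nothing asymptotically):

```python
def binary_search(A, target):
    """Return an index of target in sorted list A, or -1 if absent."""
    lo, hi = 0, len(A) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if A[mid] == target:
            return mid
        elif A[mid] < target:      # target can only be in the right half
            lo = mid + 1
        else:                      # target can only be in the left half
            hi = mid - 1
    return -1

print(binary_search([3, 5, 7, 8, 9, 12, 15], 9))  # → 4
```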
Binary Search
T(n) = T(n/2) + Θ(1)
T(1) = Θ(1)
Recursive InsertionSort
RecursiveInsertionSort(A[1..n])
1. if (n == 1) do nothing;
2. RecursiveInsertionSort(A[1..n-1]);
3. Find index i in A such that A[i] <= A[n] < A[i+1];
4. Insert A[n] after A[i];
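A runnable version of the pseudocode (Python; it returns a sorted copy rather than sorting in place, and uses the standard-library `bisect` for step 3 — the Θ(n) cost of step 4's shifting still dominates):

```python
import bisect

def recursive_insertion_sort(A):
    """Return a sorted copy of A, following steps 1-4 above."""
    if len(A) <= 1:
        return list(A)                        # step 1: nothing to do
    rest = recursive_insertion_sort(A[:-1])   # step 2: sort A[1..n-1]
    i = bisect.bisect_right(rest, A[-1])      # step 3: find the insertion index
    rest.insert(i, A[-1])                     # step 4: insert A[n] there
    return rest

print(recursive_insertion_sort([5, 2, 9, 1, 7]))  # → [1, 2, 5, 7, 9]
```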
Recurrence for InsertionSort
T(n) = T(n−1) + Θ(n)
T(1) = Θ(1)
Compute Factorial
Factorial (n)
if (n == 1) return 1;
return n * Factorial (n-1);
Recurrence for Computing Factorial
T(n) = T(n−1) + Θ(1)
T(1) = Θ(1)
What do These Signify?
T(n) = T(n−1) + 1
T(n) = T(n−1) + n
T(n) = T(n/2) + 1
T(n) = 2T(n/2) + 1
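One way to build intuition for the question is to evaluate each recurrence directly and compare against a guessed closed form (Python; the orders in the comments — Θ(n), Θ(n²), Θ(log n), Θ(n) — are my answers to the slide's question, not stated on it):

```python
import functools

@functools.lru_cache(maxsize=None)
def t_a(n): return 1 if n == 1 else t_a(n - 1) + 1       # guess: Θ(n)      (exactly n)

@functools.lru_cache(maxsize=None)
def t_b(n): return 1 if n == 1 else t_b(n - 1) + n       # guess: Θ(n²)     (n(n+1)/2)

@functools.lru_cache(maxsize=None)
def t_c(n): return 1 if n == 1 else t_c(n // 2) + 1      # guess: Θ(log n)  (log₂n + 1)

@functools.lru_cache(maxsize=None)
def t_d(n): return 1 if n == 1 else 2 * t_d(n // 2) + 1  # guess: Θ(n)      (2n − 1)

n = 256  # a power of two, so n // 2 halves exactly
print(t_a(n), t_b(n), t_c(n), t_d(n))  # → 256 32896 9 511
```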
Solving Recurrences
Running time of many algorithms can be
expressed in one of the following two
recursive forms
T(n) = a·T(n − b) + f(n)
or
T(n) = a·T(n/b) + f(n)