CSE 326: Data Structures
Lecture #3: Analysis of Recursive Algorithms
Alon Halevy
Fall Quarter 2000
Nested Dependent Loops
for i = 1 to n do
  for j = i to n do
    sum = sum + 1
$$\sum_{i=1}^{n}\sum_{j=i}^{n} 1 \;=\; \sum_{i=1}^{n}(n-i+1) \;=\; \sum_{i=1}^{n}(n+1) - \sum_{i=1}^{n} i \;=\; n(n+1) - \frac{n(n+1)}{2} \;=\; \frac{n(n+1)}{2} \;\approx\; \frac{n^2}{2}$$
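As a quick sanity check of the closed form, here is a minimal Java sketch (the class name and the value of n are illustrative, not from the lecture) that counts the iterations of the dependent loop and compares the count against n(n+1)/2:

public class NestedLoopCount {
    public static void main(String[] args) {
        int n = 10;
        int sum = 0;
        // same dependent loop structure as above: the inner index starts at i
        for (int i = 1; i <= n; i++) {
            for (int j = i; j <= n; j++) {
                sum = sum + 1;
            }
        }
        // closed form derived above: n(n+1)/2
        System.out.println(sum + " == " + (n * (n + 1) / 2));   // 55 == 55
    }
}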
Recursion
• A recursive procedure can often be analyzed
by solving a recursive equation
• Basic form:
T(n) = if (base case) then some constant
else ( time to solve subproblems +
time to combine solutions )
• Result depends upon
– how many subproblems
– how much smaller the subproblems are
– how costly to combine solutions (coefficients)
Example: Sum of Integer Queue
sum_queue(Q){
  if (Q.length == 0) return 0;
  else return Q.dequeue() + sum_queue(Q);
}
– One subproblem
– Linear reduction in size (decrease by 1)
– Combining: constant c (+), 1×subproblem
Equation: T(0) = b
          T(n) = c + T(n – 1)   for n > 0
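A runnable version of the same idea, sketched with java.util.Queue (an assumption for illustration; the lecture's own Queue ADT exposes length and dequeue):

import java.util.ArrayDeque;
import java.util.Queue;

public class SumQueue {
    // One subproblem of size n-1 plus constant work: T(n) = c + T(n-1), T(0) = b
    static int sumQueue(Queue<Integer> q) {
        if (q.isEmpty()) return 0;               // base case: constant time b
        return q.remove() + sumQueue(q);         // dequeue + recurse on the rest
    }

    public static void main(String[] args) {
        Queue<Integer> q = new ArrayDeque<>();
        q.add(3); q.add(19); q.add(2);
        System.out.println(sumQueue(q));         // prints 24
    }
}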
Sum, Continued
Equation: T(0) = b
          T(n) = c + T(n – 1)   for n > 0
Solution:
T(n) = c + T(n – 1)
     = c + c + T(n – 2)
     = c + c + c + T(n – 3)
     = …
     = kc + T(n – k)   for all k
     = nc + T(0)       for k = n
     = cn + b = O(n)
Example: Binary Search
Sorted array: 7 12 30 35 75 83 87 90 97 99
T(n) = T(n/2) + c
     = T(n/4) + c + c
     = T(n/8) + c + c + c
     = …
     = T(n/2^k) + kc
     = T(1) + c log n   where k = log n
     = b + c log n = O(log n)
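A minimal iterative binary search in Java (an illustrative sketch; the slide shows only the sorted array and the recurrence). Each comparison discards half of the remaining range, which is exactly where T(n) = T(n/2) + c comes from:

public class BinarySearch {
    // Returns the index of key in the sorted array a, or -1 if it is absent.
    static int binarySearch(int[] a, int key) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;          // constant work c per step
            if (a[mid] == key) return mid;
            else if (a[mid] < key) lo = mid + 1;   // discard the left half
            else hi = mid - 1;                     // discard the right half
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = {7, 12, 30, 35, 75, 83, 87, 90, 97, 99};
        System.out.println(binarySearch(a, 75));   // prints 4
    }
}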
Example: MergeSort
Split array in half, sort each half, merge together
– 2 subproblems, each half as large
– linear amount of work to combine
T(1) = b
T(n) = 2T(n/2) + cn   for n > 1
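Unrolling this recurrence follows the same pattern as the binary-search derivation above (a sketch; the slide itself stops at the recurrence):

$$\begin{aligned}
T(n) &= 2T(n/2) + cn \\
     &= 4T(n/4) + 2cn \\
     &= 8T(n/8) + 3cn \\
     &= 2^k\,T(n/2^k) + kcn \\
     &= n\,T(1) + cn\log n \qquad \text{for } k = \log n \\
     &= bn + cn\log n = O(n\log n)
\end{aligned}$$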
Learning from Analysis
• To avoid recursive calls
– store all basis values in a table
– each time you calculate an answer, store it in the table
– before performing any calculation for a value n
• check if a valid answer for n is in the table
• if so, return it
• Memoization
– a form of dynamic programming
• How much time does memoized version take?
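As a concrete illustration of memoization (the lecture names no specific function, so Fibonacci is used here as an assumed example), the table of already-computed answers is checked before any recursive work is done:

import java.util.HashMap;
import java.util.Map;

public class MemoFib {
    // The table of answers computed so far (the memo).
    static Map<Integer, Long> table = new HashMap<>();

    static long fib(int n) {
        if (n <= 1) return n;                     // basis values
        Long cached = table.get(n);
        if (cached != null) return cached;        // a valid answer for n is already in the table
        long answer = fib(n - 1) + fib(n - 2);    // compute it once...
        table.put(n, answer);                     // ...and store it for later calls
        return answer;
    }

    public static void main(String[] args) {
        System.out.println(fib(50));              // fast: each value is computed once
    }
}

Since each value is computed at most once and table lookups take constant time, the memoized version runs in O(n) time for input n, versus exponential time for the plain recursion.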
Kinds of Analysis
• So far we have considered worst case analysis
• We may want to know how an algorithm performs
“on average”
• Several distinct senses of “on average”
– amortized
• average time per operation over a sequence of operations
– average case
• average time over a random distribution of inputs
– expected case
• average time for a randomized algorithm over different random
seeds for any input
Amortized Analysis
• Consider any sequence of operations applied to a
data structure
– your worst enemy could choose the sequence!
• Some operations may be fast, others slow
• Goal: show that the average time per operation is
still good
average time per operation = (total time for n operations) / n
Stack ADT
• Stack operations
  – push
  – pop
  – is_empty
• Stack property: if x is on the stack before y is
pushed, then x will be popped after y is popped
What is the biggest problem with an array implementation?
Stretchy Stack Implementation
int data[];
int maxsize;
int top;

Push(e){
  if (top == maxsize){
    temp = new int[2*maxsize];
    copy data into temp;
    deallocate data;
    data = temp;
    maxsize = 2*maxsize;
  }
  data[++top] = e;
}

Best case Push = O( )
Worst case Push = O( )
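A runnable Java sketch of the same doubling idea (the class design and initial capacity are illustrative, not the course's exact implementation):

public class StretchyStack {
    private int[] data = new int[4];   // initial capacity chosen arbitrarily
    private int size = 0;              // number of elements currently stored

    public void push(int e) {
        if (size == data.length) {                     // full: stretch to twice the capacity
            int[] temp = new int[2 * data.length];
            System.arraycopy(data, 0, temp, 0, size);  // copying k elements costs about kb
            data = temp;                               // old array becomes garbage
        }
        data[size++] = e;                              // regular push: constant time a
    }

    public int pop() {
        return data[--size];
    }

    public boolean isEmpty() {
        return size == 0;
    }

    public static void main(String[] args) {
        StretchyStack s = new StretchyStack();
        for (int i = 0; i < 10; i++) s.push(i);        // triggers a couple of stretches
        System.out.println(s.pop());                   // prints 9
    }
}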
Stretchy Stack Amortized Analysis
• Consider a sequence of n operations
  push(3); push(19); push(2); …
• What is the max number of stretches?
log n
• What is the total time?
  – let's say a regular push takes time a, and stretching an array containing k elements
    takes time kb, for some constants a and b.

$$an + b(1 + 2 + 4 + 8 + \cdots + n) \;=\; an + b\sum_{i=0}^{\log n} 2^i$$
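Finishing the arithmetic (a completion not shown on this slide): the geometric series sums to about 2n, so the whole sequence of pushes costs O(n), i.e. O(1) amortized per push.

$$\sum_{i=0}^{\log n} 2^i = 2^{\log n + 1} - 1 = 2n - 1
\quad\Longrightarrow\quad
an + b(2n - 1) = O(n)$$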