Algorithms (CC4010) - 2018/2019 DCC/FCUP

Exercises #2
Asymptotic Analysis

Theoretical Background
Remember the asymptotic notation:

• f(n) = O(g(n)) if there exist positive constants n0 and c such that f(n) ≤ c·g(n) for all n ≥ n0.

• f(n) = Ω(g(n)) if there exist positive constants n0 and c such that f(n) ≥ c·g(n) for all n ≥ n0.

• f(n) = Θ(g(n)) if there exist positive constants n0, c1 and c2 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.

• f(n) = o(g(n)) if for every positive constant c there exists n0 such that f(n) < c·g(n) for all n ≥ n0.

• f(n) = ω(g(n)) if for every positive constant c there exists n0 such that f(n) > c·g(n) for all n ≥ n0.
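These definitions can be probed numerically: if f(n)/g(n) stays bounded as n grows, f(n) = O(g(n)) is plausible, and if the ratio tends to 0, so is f(n) = o(g(n)). A minimal sketch (the sample functions are illustrative, not taken from the exercises; a bounded ratio is evidence, not a proof):

```python
# Evaluate f(n)/g(n) over growing n: a bounded ratio suggests f = O(g),
# a ratio tending to 0 suggests f = o(g). This is evidence, not a proof.
def ratios(f, g, sizes):
    return [f(n) / g(n) for n in sizes]

sizes = [10, 100, 1000, 10000]
print(ratios(lambda n: 3 * n + 5, lambda n: n, sizes))   # stays near 3 -> O (in fact Theta)
print(ratios(lambda n: n, lambda n: n * n, sizes))       # tends to 0   -> o (hence also O)
```

To turn such an observation into a proof, exhibit the constants from the definition explicitly, e.g. 3n + 5 ≤ 4n for all n ≥ 5.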

Asymptotic Notation
1. Is 2^(n+1) = O(2^n)? Is 2^(2n) = O(2^n)? Justify your answers with brief proofs.

2. For each pair of functions f(n) and g(n), indicate whether f(n) is O, o, Ω, ω, or Θ of g(n). Your answer
should be a "yes" or "no" for each cell of the table.
         f(n)               g(n)           O   o   Ω   ω   Θ
    (a)  2n^3 − 10n^2       25n^2 + 37n
    (b)  56                 log2 30
    (c)  log3 n             log2 n
    (d)  n^3                3^n
    (e)  n!                 2^n
    (f)  n!                 n^n
    (g)  n log2 n + n^2     n^2
    (h)  n                  log2 n
    (i)  log3(log3 n)       log3 n
    (j)  log2 n             log2 n^2

3. For each of the following conjectures, indicate whether it is true or false, and explain why.
You can assume that the functions f(n) and g(n) are asymptotically positive, i.e., positive from
some point on (∃n0 : f(n) > 0 for all n ≥ n0).

(a) f(n) = O(g(n)) implies that g(n) = O(f(n))

(b) f(n) = O(g(n)) implies that g(n) = Ω(f(n))

(c) f(n) + g(n) = Θ(min(f(n), g(n)))

(d) f(n) + g(n) = Θ(max(f(n), g(n)))

(e) (n + c)^k = Θ(n^k), where c and k are positive integer constants

(f) f(n) + o(f(n)) = Θ(f(n))

(g) n^2 = Θ(16^(log4 n))

Growth Ratio
4. Imagine a program A with time complexity Θ(f(n)) that takes t seconds for an input of size
k. What is your estimate of the execution time for an input of size 2k, for each of the following
functions: n, n^2, n^3, 2^n, log2 n? Is this growth ratio constant for every k, or does it change with k?

5. Consider two programs implementing algorithms A and B, both trying to solve the same problem for
an input of size n. They measured the execution times for test cases of different sizes and got the
following table:
Algorithm n = 100 n = 200 n = 300 n = 400 n = 500
A 0.003s 0.024s 0.081s 0.192s 0.375s
B 0.040s 0.160s 0.360s 0.640s 1.000s

(a) Which program is more efficient? Why?

(b) Could you combine both algorithms into a program implementing an algorithm C that is at
least as good as both A and B for any test case?
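One way to explore the measurements (try the question on paper first): divide each time by a candidate growth function f(n) and check whether the quotient t(n)/f(n) is roughly constant. The candidate exponents below are assumptions to be tested, not part of the statement:

```python
# Timings from the table above. If t(n)/f(n) is (roughly) the same
# constant for every n, then t(n) ~ c * f(n) fits the data.
sizes = [100, 200, 300, 400, 500]
times_A = [0.003, 0.024, 0.081, 0.192, 0.375]
times_B = [0.040, 0.160, 0.360, 0.640, 1.000]

def constants(times, f):
    """Quotients t(n)/f(n); a near-constant list supports the candidate f."""
    return [t / f(n) for n, t in zip(sizes, times)]

print(constants(times_A, lambda n: n ** 3))   # near-constant quotients
print(constants(times_B, lambda n: n ** 2))   # near-constant quotients
```

A good fit in the measured range 100 ≤ n ≤ 500 only pins down the constant and the exponent there; deciding which program is "more efficient", and for which n, is the point of part (a).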
