Big-O Algorithm Complexity Cheat Sheet
Searching

Algorithm                                   Data Structure                        Time (Average)           Time (Worst)             Space (Worst)
Depth First Search (DFS)                    Graph of |V| vertices and |E| edges   -                        O(|E| + |V|)             O(|V|)
Shortest path by Dijkstra,                  Graph with |V| vertices               O((|V| + |E|) log |V|)   O((|V| + |E|) log |V|)   O(|V|)
  using a min-heap as priority queue          and |E| edges
Shortest path by Dijkstra,                  Graph with |V| vertices               O(|V|^2)                 O(|V|^2)                 O(|V|)
  using an unsorted array as priority queue   and |E| edges
Shortest path by Bellman-Ford               Graph with |V| vertices               O(|V||E|)                O(|V||E|)                O(|V|)
                                              and |E| edges

Source: http://bigocheatsheet.com/# (retrieved 5/29/2014)
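The min-heap variant from the table above can be sketched in Python with the standard-library heapq module. The graph shape and vertex names here are illustrative, not part of the original sheet:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths using a binary min-heap as the
    priority queue: O((|V| + |E|) log |V|) time, O(|V|) space.
    `graph` maps each vertex to a list of (neighbor, weight) pairs."""
    dist = {v: float("inf") for v in graph}
    dist[source] = 0
    heap = [(0, source)]                 # entries are (distance, vertex)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:                  # stale heap entry: skip it
            continue
        for v, w in graph[u]:
            if d + w < dist[v]:          # relax edge u -> v
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(g, "a"))   # {'a': 0, 'b': 1, 'c': 3}
```

Stale entries are skipped on pop instead of decreased in place, which keeps heapq usable at the cost of extra O(log |V|) pushes.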
Sorting

Algorithm    Data Structure    Time Complexity (Worst Case)    Auxiliary Space Complexity (Worst Case)
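As one representative comparison sort, here is a minimal merge sort sketch: O(n log n) time in every case and O(n) auxiliary space, matching the columns above:

```python
def merge_sort(a):
    """Stable comparison sort: O(n log n) time (best, average, worst),
    O(n) auxiliary space for the sublists and merge buffer."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge the two sorted halves
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])                   # at most one of these is non-empty
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5]))   # [1, 2, 5, 5, 9]
```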
Data Structures

                     Time Complexity (Average)                    Time Complexity (Worst)                      Space (Worst)
Data Structure       Indexing   Search     Insertion  Deletion    Indexing   Search     Insertion  Deletion
Dynamic Array        O(1)       O(n)       O(n)       O(n)        O(1)       O(n)       O(n)       O(n)        O(n)
Singly-Linked List   O(n)       O(n)       O(1)       O(1)        O(n)       O(n)       O(1)       O(1)        O(n)
Doubly-Linked List   O(n)       O(n)       O(1)       O(1)        O(n)       O(n)       O(1)       O(1)        O(n)
Skip List            O(log(n))  O(log(n))  O(log(n))  O(log(n))   O(n)       O(n)       O(n)       O(n)        O(n log(n))
Binary Search Tree   O(log(n))  O(log(n))  O(log(n))  O(log(n))   O(n)       O(n)       O(n)       O(n)        O(n)
B-Tree               O(log(n))  O(log(n))  O(log(n))  O(log(n))   O(log(n))  O(log(n))  O(log(n))  O(log(n))   O(n)
Red-Black Tree       O(log(n))  O(log(n))  O(log(n))  O(log(n))   O(log(n))  O(log(n))  O(log(n))  O(log(n))   O(n)
AVL Tree             O(log(n))  O(log(n))  O(log(n))  O(log(n))   O(log(n))  O(log(n))  O(log(n))  O(log(n))   O(n)
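The singly-linked list row can be illustrated with a minimal sketch (class and method names are my own): insertion at the head is O(1), while search must walk the chain in O(n). Contrast this with a dynamic array (a Python list), where indexing is O(1) but insertion may shift elements in O(n):

```python
class Node:
    __slots__ = ("value", "next")
    def __init__(self, value, next=None):
        self.value, self.next = value, next

class SinglyLinkedList:
    """Head insertion: O(1). Search: O(n). Space: O(n)."""
    def __init__(self):
        self.head = None

    def insert_front(self, value):      # O(1): just relink the head
        self.head = Node(value, self.head)

    def search(self, value):            # O(n): walk node to node
        node = self.head
        while node is not None:
            if node.value == value:
                return True
            node = node.next
        return False

lst = SinglyLinkedList()
for x in (1, 2, 3):
    lst.insert_front(x)
print(lst.search(2))   # True
print(lst.search(9))   # False
```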
Heaps

Heaps                    Heapify   Find Max   Extract Max   Increase Key   Insert   Delete   Merge
Linked List (unsorted)   -         O(n)       O(n)          O(1)           O(1)     O(1)     O(1)
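A binary heap supports find-min in O(1) and insert/extract in O(log n). Python's standard-library heapq is a min-heap, so the sketch below mirrors the table's max-heap columns with "min" in place of "max":

```python
import heapq

heap = []
for x in (5, 1, 4, 2):
    heapq.heappush(heap, x)        # insert: O(log n) per push
print(heap[0])                     # find-min: O(1)
print(heapq.heappop(heap))         # extract-min: O(log n)
print(heapq.heappop(heap))         # next-smallest element
```

For a max-heap with heapq, the usual trick is to push negated keys and negate again on pop.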
Graphs

Node/Edge Management     Storage   Add Vertex   Add Edge   Remove Vertex   Remove Edge   Query
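As a minimal sketch of one row of this table (names are illustrative): an adjacency-list graph using a hash set of neighbors per vertex stores O(|V| + |E|) data, and adds, removes, or queries an edge in O(1) on average:

```python
from collections import defaultdict

class Graph:
    """Undirected adjacency-list graph: O(|V| + |E|) storage.
    Edge add/remove/query are O(1) average via per-vertex hash sets."""
    def __init__(self):
        self.adj = defaultdict(set)

    def add_vertex(self, v):            # O(1)
        self.adj[v]                     # defaultdict creates the entry

    def add_edge(self, u, v):           # O(1) average
        self.adj[u].add(v)
        self.adj[v].add(u)

    def remove_edge(self, u, v):        # O(1) average
        self.adj[u].discard(v)
        self.adj[v].discard(u)

    def has_edge(self, u, v):           # O(1) average
        return v in self.adj[u]

g = Graph()
g.add_edge("a", "b")
print(g.has_edge("b", "a"))   # True
g.remove_edge("a", "b")
print(g.has_edge("a", "b"))   # False
```

An adjacency matrix flips the trade-off: O(|V|^2) storage but O(1) worst-case edge queries.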
[1] Big O is an upper bound, while Omega is a lower bound. Theta requires both Big O and Omega, which is why it is called a tight bound (it must be both an upper and a lower bound). For example, an algorithm taking Omega(n log n) takes at least n log n time but has no upper limit. An algorithm taking Theta(n log n) is far preferable, since it takes at least n log n time (Omega(n log n)) and no more than n log n time (Big O(n log n)).
[2] f(x) = Θ(g(n)) means f (the running time of the algorithm) grows exactly like g when n (the input size) gets larger. In other words, the growth rate of f(x) is asymptotically proportional to g(n).
[3] f(x) = O(g(n)) is similar, but here the growth rate of f is no faster than g(n). Big O is the most useful notation because it represents worst-case behavior.
Algorithm performance

Notation   Growth rate relative to n
o(n)       < n
O(n)       ≤ n
Θ(n)       = n
Ω(n)       ≥ n
ω(n)       > n
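Notes [2] and [3] can be checked numerically. The step-count function below is hypothetical: 3·n·log2(n) + 10·n is Θ(n log n), so its ratio to n log2(n) settles near the constant 3 as n grows and the lower-order 10·n term stops mattering:

```python
import math

def cost(n):
    """Hypothetical step count: 3*n*log2(n) + 10*n, which is
    Theta(n log n) because the 10*n term is lower order."""
    return 3 * n * math.log2(n) + 10 * n

# The ratio cost(n) / (n log2 n) approaches the leading constant 3.
for n in (10**3, 10**5, 10**7):
    print(n, round(cost(n) / (n * math.log2(n)), 2))
```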