Dsa GFG C++
Greedy Algorithms
Activity Selection Problem
Fractional Knapsack
Job Sequencing Problem
Dynamic Programming
Memoization (Introduction)
Tabulation (Introduction)
Longest Common Subsequence
Memoization Solution
Tabulation Solution
Variations of LCS
Coin Change (Count Combination)
Edit Distance Problem
Longest Increasing Subsequence
LIS using Binary Search in T.C. O(nlogn) and S.C. O(n)
Variations of LIS
Maximum Sum Increasing Subsequence
Maximum Length Bitonic Sequence
Building Bridges
Longest Chain of Pairs
Maximum Cuts
Minimum Coins to make a value
Trees
Height of Binary Tree
Printing all nodes at k distance from the root node
Prints Values Level-by-Level
Prints Values Level-by-Level (Without NULL insertion)
Find Size of a Binary Tree
Maximum in Binary Tree
Left View of a Binary Tree
Method-1
Method-2
Children Sum Property
Check for Balanced Tree
Maximum Width of a Binary Tree
Convert a Binary Tree to a DLL in-place
Construct a Binary Tree from Inorder and Preorder
Tree Traversal in Spiral Form
Recursion
Generating Subsets
Number of Subsets with a given sum
Printing all permutations of a string
Graphs
Graph Representation
BFS
First Version
Second Version
Count Connected Components in an Undirected Graph
Applications of BFS
DFS
First Version (Undirected & Connected Graph)
Second Version (No source given & Disconnected Graph)
Count Connected Components in an Undirected Graph
Applications of DFS
Shortest Path in an Unweighted Graph
Detect Cycles in an Undirected Graph
Detect Cycles in a Directed Graph
Greedy Algorithms:
Tabulation (Introduction):
Here, since F(n) depends on the previous 2 values, make sure the first 2 values (here, F(0) and F(1)) are already filled before the loop starts.
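A minimal sketch of this tabulation idea for Fibonacci (the example the note refers to; the function name is my own):

```cpp
#include <vector>

// Bottom-up (tabulation) Fibonacci: fill the base cases F(0) and F(1)
// before the loop, since every later entry depends on the previous two.
int fib(int n) {
    if (n < 2) return n;
    std::vector<int> dp(n + 1);
    dp[0] = 0;                          // base case F(0)
    dp[1] = 1;                          // base case F(1)
    for (int i = 2; i <= n; ++i)
        dp[i] = dp[i - 1] + dp[i - 2];  // each value computed exactly once
    return dp[n];
}
```

Since only one parameter (n) changes between calls, a 1-D table suffices, matching the rule below.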
For DP, first find the recursive relation; then, if only one parameter changes between the recursive calls, use a 1-D DP table; for two parameters, a 2-D table; and so on.
Because every value is calculated only once, the memoized algorithm takes only O(mn) time.
So, the worst-case time complexity of the naive recursive solution is O(2^n), as there are 2^n nodes in the recursion tree in total, and the time taken at each node LCS(s1, s2, i, j) is constant once all the values below it are known, since computing a node's value involves only an addition or a comparison.
Tabulation Solution:
T.C. - O(mn)
In the tabulation method, the row number means how many characters of s1 are taken
(0, 1, 2, 3, 4) and column number means how many characters of s2 are taken (0, 1, 2, 3)
in the above example.
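The table just described can be sketched as follows (a hedged sketch; the function name is mine):

```cpp
#include <string>
#include <vector>
#include <algorithm>

// dp[i][j] = LCS length of the first i characters of s1
//            and the first j characters of s2.
int lcs(const std::string& s1, const std::string& s2) {
    int m = s1.size(), n = s2.size();
    std::vector<std::vector<int>> dp(m + 1, std::vector<int>(n + 1, 0));
    for (int i = 1; i <= m; ++i)
        for (int j = 1; j <= n; ++j)
            if (s1[i - 1] == s2[j - 1])
                dp[i][j] = dp[i - 1][j - 1] + 1;              // match: diagonal + 1
            else
                dp[i][j] = std::max(dp[i - 1][j], dp[i][j - 1]);  // best of dropping one char
    return dp[m][n];  // T.C. O(mn), S.C. O(mn)
}
```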
Coin Change (Count Combination):
Count the number of combinations in which coins, from an infinite supply of each denomination, can be taken so that they sum to a given value.
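One way this counting can be sketched, assuming coin order does not matter (combinations, not permutations; names are my own):

```cpp
#include <vector>

// Count combinations summing to `value`, unlimited supply of each coin.
// Keeping coins in the OUTER loop counts combinations, not permutations.
int countCombinations(const std::vector<int>& coins, int value) {
    std::vector<int> dp(value + 1, 0);
    dp[0] = 1;  // one way to make 0: take no coins
    for (int c : coins)
        for (int v = c; v <= value; ++v)
            dp[v] += dp[v - c];  // extend every way of making v - c with coin c
    return dp[value];
}
```

For example, with coins {1, 2, 3} and value 4, the combinations are {1,1,1,1}, {1,1,2}, {2,2}, {1,3}.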
Edit Distance Problem:
The recursive solution basically finds out the minimum number of operations for every
pair of substrings of both strings
This recursive solution has many overlapping sub-problems. So, we use DP:
Meaning of this DP table: dp[i][j] gives the minimum number of operations required to convert the string "first i characters of string 1" into the string "first j characters of string 2".
Use the same recurrence relation as before, and understand what the recurrence
relation means in the dp table after adding the base cases.
For example, in this case it means: if the characters corresponding to cell (1,1) in both strings are the same {s1[i-1] == s2[j-1]}, then just copy the diagonally previous value of the dp table into this cell too.
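A sketch of that table, with the base cases filled first (function name is my own):

```cpp
#include <string>
#include <vector>
#include <algorithm>

// dp[i][j] = min operations to convert the first i chars of s1
//            into the first j chars of s2.
int editDistance(const std::string& s1, const std::string& s2) {
    int m = s1.size(), n = s2.size();
    std::vector<std::vector<int>> dp(m + 1, std::vector<int>(n + 1));
    for (int i = 0; i <= m; ++i) dp[i][0] = i;   // delete all i characters
    for (int j = 0; j <= n; ++j) dp[0][j] = j;   // insert all j characters
    for (int i = 1; i <= m; ++i)
        for (int j = 1; j <= n; ++j)
            if (s1[i - 1] == s2[j - 1])
                dp[i][j] = dp[i - 1][j - 1];     // same character: copy diagonal
            else
                dp[i][j] = 1 + std::min({dp[i - 1][j - 1],   // replace
                                         dp[i - 1][j],       // delete
                                         dp[i][j - 1]});     // insert
    return dp[m][n];
}
```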
Longest Increasing Subsequence:
Brute-Force: find all subsequences (2^n) and see which are increasing.
2^n because each element can either be included or not included (all sets in the power set).
Main Idea of DP Solution:
Find Longest Increasing Subsequence ending at each element and find max of all of
them.
Taking the max works because the LIS of the whole sequence has to end at some element.
For finding LIS ending at each element: For every element, check the LIS value of
elements that are less than it to the left of it.
In conclusion, for each element, loop over all elements from the start up to that element; among those that are smaller, take the maximum LIS value, and add 1 to it.
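The O(n^2) DP just described can be sketched like this (function name is my own):

```cpp
#include <vector>
#include <algorithm>

// lisEnd[i] = length of the longest increasing subsequence ending at arr[i].
int lis(const std::vector<int>& arr) {
    int n = arr.size();
    if (n == 0) return 0;
    std::vector<int> lisEnd(n, 1);          // every element alone has length 1
    for (int i = 1; i < n; ++i)
        for (int j = 0; j < i; ++j)         // check all smaller elements to the left
            if (arr[j] < arr[i])
                lisEnd[i] = std::max(lisEnd[i], lisEnd[j] + 1);
    return *std::max_element(lisEnd.begin(), lisEnd.end());  // LIS ends somewhere
}
```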
Maximum Length Bitonic Sequence:
(The maximum-length subsequence that (S)trictly increases up to a point and (S)trictly decreases from there.)
(The length of the (S)trictly increasing part or of the (S)trictly decreasing part could be 0 too.)
Find LIS until each element and LDS until each element.
{To find LDS until each element, find LIS from the last element}
T.C. = O(n^2)
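A sketch of the LIS-plus-LDS idea above (the LDS "until each element" is computed by running the same loops from the right; names are my own):

```cpp
#include <vector>
#include <algorithm>

int longestBitonic(const std::vector<int>& arr) {
    int n = arr.size();
    if (n == 0) return 0;
    std::vector<int> inc(n, 1), dec(n, 1);
    for (int i = 1; i < n; ++i)          // inc[i] = LIS ending at i
        for (int j = 0; j < i; ++j)
            if (arr[j] < arr[i]) inc[i] = std::max(inc[i], inc[j] + 1);
    for (int i = n - 2; i >= 0; --i)     // dec[i] = LDS starting at i (LIS from the right)
        for (int j = n - 1; j > i; --j)
            if (arr[j] < arr[i]) dec[i] = std::max(dec[i], dec[j] + 1);
    int best = 0;
    for (int i = 0; i < n; ++i)          // element i is counted in both, so subtract 1
        best = std::max(best, inc[i] + dec[i] - 1);
    return best;
}
```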
A better approach to find Maximum Length Bitonic Subsequence would be using Binary
Search and can be done in O(nlogn)
Building Bridges:
An array of pairs of cities is given, where the first element of each pair is on one side of a river and the second element is on the other side.
Return the maximum number of bridges that you can build from the given list of bridges while making sure that no 2 bridges cross one another.
A noteworthy point here is that we are no longer trying to search for “subsequences”,
i.e., we no longer care about the relative order of the bridges, we just want to maximize
the number of bridges not letting them cross.
And this is the reason it is ok to sort: even if the relative ordering is lost, it does not affect the answer.
Longest Chain of Pairs:
A noteworthy point here too is that we are no longer searching for “subsequences”, i.e., we no longer care about the relative order of the pairs; we just want the longest chain of pairs with b < c.
And this is the reason it is ok to sort: even if the relative ordering is lost, it does not affect the answer.
Unlike in normal LIS, use “if (arr[i].first > arr[j].second)” for the DP based O(n^2)
solution.
However, this can be solved using the Binary Search Algorithm too in O(nlogn)
The difference between this problem and the Building Bridges problem is that, in the latter, a might be less than, equal to, or greater than b. Here, though, we are told that a < b.
So, this can be thought of as a more specific version of the Building Bridges problem.
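A sketch of the O(n^2) DP for the chain of pairs, using the condition noted above (function name is my own):

```cpp
#include <vector>
#include <algorithm>
#include <utility>

// Longest chain of pairs: (c, d) can follow (a, b) iff b < c.
// Sort by the first element, then run an LIS-style DP with
// the condition p[i].first > p[j].second.
int longestChain(std::vector<std::pair<int, int>> p) {
    std::sort(p.begin(), p.end());
    int n = p.size();
    if (n == 0) return 0;
    std::vector<int> dp(n, 1);   // dp[i] = longest chain ending with p[i]
    for (int i = 1; i < n; ++i)
        for (int j = 0; j < i; ++j)
            if (p[i].first > p[j].second)
                dp[i] = std::max(dp[i], dp[j] + 1);
    return *std::max_element(dp.begin(), dp.end());
}
```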
Maximum Cuts:
Cut the rod of length n into pieces of either size a or size b or size c
Worst-case T.C. = O(3^n); the actual T.C. depends on the values of a, b, and c.
Check that it works for the corner case too: don't return res + 1 when the max of all 3 recursive results is -1; return -1 instead (no valid cutting exists).
Tabulation Solution:
We take a 1D array for DP as there is only one parameter changing across calls = value
Take a DP array of size: value+1
(Value+1 because, in case of value = 0, return 0)
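That 1-D tabulation, including the corner-case handling above, can be sketched as (function name is my own):

```cpp
#include <vector>
#include <algorithm>

// dp[v] = max pieces for a rod of length v, or -1 if no exact cutting exists.
int maxCuts(int n, int a, int b, int c) {
    std::vector<int> dp(n + 1, -1);
    dp[0] = 0;  // base case: rod of length 0 gives 0 pieces
    for (int v = 1; v <= n; ++v) {
        if (v >= a && dp[v - a] != -1) dp[v] = std::max(dp[v], dp[v - a] + 1);
        if (v >= b && dp[v - b] != -1) dp[v] = std::max(dp[v], dp[v - b] + 1);
        if (v >= c && dp[v - c] != -1) dp[v] = std::max(dp[v], dp[v - c] + 1);
    }
    return dp[n];  // -1 means the rod cannot be cut exactly
}
```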
Minimum Coins to make a value:
Tabulation Solution:
Again, check your code with the extreme case of: value = odd and all coins even (no solution exists).
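A sketch of that tabulation, returning -1 when the value cannot be formed (names are my own):

```cpp
#include <vector>
#include <algorithm>
#include <climits>

// dp[v] = minimum coins needed to make v; INT_MAX marks "impossible".
int minCoins(const std::vector<int>& coins, int value) {
    std::vector<int> dp(value + 1, INT_MAX);
    dp[0] = 0;  // base case: value 0 needs 0 coins
    for (int v = 1; v <= value; ++v)
        for (int c : coins)
            if (c <= v && dp[v - c] != INT_MAX)   // skip unreachable sub-values
                dp[v] = std::min(dp[v], dp[v - c] + 1);
    return dp[value] == INT_MAX ? -1 : dp[value];
}
```

The second test below is exactly the extreme case mentioned above: an odd value with only even coins.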
Trees:
Same T.C. and S.C. for Postorder and Preorder Recursive Algos too.
Height of Binary Tree:
Again, worst case T.C. = O(n) and worst case S.C. = O(h)
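A sketch of the recursive height computation behind those bounds (Node definition and convention, counting nodes on the longest root-to-leaf path, are my assumptions):

```cpp
#include <algorithm>

struct Node {
    int data;
    Node *left = nullptr, *right = nullptr;
    Node(int d) : data(d) {}
};

// Height = number of nodes on the longest root-to-leaf path; empty tree = 0.
// T.C. O(n): each node visited once. S.C. O(h): recursion stack depth.
int height(Node* root) {
    if (root == nullptr) return 0;
    return 1 + std::max(height(root->left), height(root->right));
}
```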
Again, T.C. = O(n) and S.C. = Theta(width), i.e., S.C. = O(n), since the maximum possible width, at the lowest level, is n/2 (the number of leaf nodes).
Find Size of a Binary Tree:
Note: In any of the Tree Traversal techniques, each node is visited exactly once.
We could also find the size of a Binary Tree iteratively using any traversal, e.g., Level Order Traversal. In Level Order Traversal's case, the T.C. = O(n) again, but S.C. = Theta(width).
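The recursive version, where each node is visited exactly once, can be sketched as (Node definition and function name are mine):

```cpp
struct Node {
    int data;
    Node *left = nullptr, *right = nullptr;
    Node(int d) : data(d) {}
};

// Size = 1 (this node) + size of left subtree + size of right subtree.
int treeSize(Node* root) {
    if (root == nullptr) return 0;
    return 1 + treeSize(root->left) + treeSize(root->right);
}
```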
Maximum in Binary Tree:
Left View of a Binary Tree:
Method-1:
In preorder traversal, the first node you reach at every new level is the node visible in the left view.
T.C. = O(n) and S.C. = theta(h)
Method-2:
T.C. = O(n) and S.C. = theta(width)
Just print the first node in every level (you know how to determine when a level ends): print the node value only when i = 0.
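A sketch of Method-2, collecting the first node of each level during level order traversal (Node definition and names are mine):

```cpp
#include <queue>
#include <vector>

struct Node {
    int data;
    Node *left = nullptr, *right = nullptr;
    Node(int d) : data(d) {}
};

std::vector<int> leftView(Node* root) {
    std::vector<int> out;
    if (root == nullptr) return out;
    std::queue<Node*> q;
    q.push(root);
    while (!q.empty()) {
        int count = q.size();                // all nodes of the current level
        for (int i = 0; i < count; ++i) {
            Node* cur = q.front(); q.pop();
            if (i == 0) out.push_back(cur->data);  // first node of this level
            if (cur->left)  q.push(cur->left);
            if (cur->right) q.push(cur->right);
        }
    }
    return out;
}
```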
Children Sum Property:
Check for Balanced Tree:
Height-Balanced Tree: for every node, the heights of the left and right subtrees must not differ by more than 1.
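A single-pass O(n) check can be sketched by returning the height when balanced and -1 otherwise (Node definition and names are mine):

```cpp
#include <algorithm>
#include <cstdlib>

struct Node {
    int data;
    Node *left = nullptr, *right = nullptr;
    Node(int d) : data(d) {}
};

// Returns the subtree height if balanced, or -1 as soon as imbalance is found.
int checkBalanced(Node* root) {
    if (root == nullptr) return 0;
    int lh = checkBalanced(root->left);
    if (lh == -1) return -1;
    int rh = checkBalanced(root->right);
    if (rh == -1) return -1;
    if (std::abs(lh - rh) > 1) return -1;   // heights differ by more than 1
    return 1 + std::max(lh, rh);
}

bool isBalanced(Node* root) { return checkBalanced(root) != -1; }
```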
Maximum Width of a Binary Tree:
Just use level order traversal and find nodes at each level, find max of all.
Since we used Level Order Traversal only, T.C. = O(n) and S.C. = theta(width of the tree)
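That level-by-level counting can be sketched as (Node definition and names are mine):

```cpp
#include <queue>
#include <algorithm>

struct Node {
    int data;
    Node *left = nullptr, *right = nullptr;
    Node(int d) : data(d) {}
};

// Width of a level = number of nodes in the queue when the level starts.
int maxWidth(Node* root) {
    if (root == nullptr) return 0;
    std::queue<Node*> q;
    q.push(root);
    int best = 0;
    while (!q.empty()) {
        int count = q.size();               // node count of the current level
        best = std::max(best, count);
        for (int i = 0; i < count; ++i) {
            Node* cur = q.front(); q.pop();
            if (cur->left)  q.push(cur->left);
            if (cur->right) q.push(cur->right);
        }
    }
    return best;
}
```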
Convert a Binary Tree to a DLL in-place:
In-place => the tree nodes' own space should be reused for the DLL nodes (no new nodes allocated).
Binary Tree can be uniquely constructed if Inorder and any other traversal is given:
Inorder & Preorder
Inorder & Postorder
Inorder & Level-Order
// Revise how to manually construct the tree from the 2 traversals given
In the code, we iterate through the preorder sequence, find where the current preorder element sits in the inorder sequence, and then recurse on the left part and on the right part.
The above takes O(n^2) in the worst case because of the searching. However, we can make it O(n) by using a Hash Map for the search.
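The O(n) hash-map version can be sketched like this, assuming distinct node values (helper names are my own):

```cpp
#include <unordered_map>
#include <vector>

struct Node {
    int data;
    Node *left = nullptr, *right = nullptr;
    Node(int d) : data(d) {}
};

// pos maps each value to its inorder index, so every lookup is O(1).
Node* buildRec(const std::vector<int>& pre, int& preIdx, int inLo, int inHi,
               const std::unordered_map<int, int>& pos) {
    if (inLo > inHi) return nullptr;
    Node* root = new Node(pre[preIdx++]);   // next preorder element is the root
    int mid = pos.at(root->data);           // split inorder around the root
    root->left  = buildRec(pre, preIdx, inLo, mid - 1, pos);
    root->right = buildRec(pre, preIdx, mid + 1, inHi, pos);
    return root;
}

Node* buildTree(const std::vector<int>& in, const std::vector<int>& pre) {
    std::unordered_map<int, int> pos;
    for (int i = 0; i < (int)in.size(); ++i) pos[in[i]] = i;
    int preIdx = 0;
    return buildRec(pre, preIdx, 0, (int)in.size() - 1, pos);
}
```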
Generating Subsets:
The same problem can be seen as the problem to generate all subsequences of a given
array or string
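The include/exclude recursion that generates all 2^n subsets can be sketched as (function name is mine):

```cpp
#include <string>
#include <vector>

// At each index, either exclude or include s[i]; the recursion tree
// has 2^n leaves, one per subset (= subsequence of the string).
void genSubsets(const std::string& s, std::size_t i, std::string cur,
                std::vector<std::string>& out) {
    if (i == s.size()) { out.push_back(cur); return; }
    genSubsets(s, i + 1, cur, out);           // exclude s[i]
    genSubsets(s, i + 1, cur + s[i], out);    // include s[i]
}
```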
Number of Subsets with a given sum:
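The counting variant follows the same include/exclude recursion, decreasing the remaining sum on inclusion (a sketch; names are my own):

```cpp
#include <vector>

// Count subsets of arr[i..] that sum exactly to `sum`.
int countSubsets(const std::vector<int>& arr, int i, int sum) {
    if (i == (int)arr.size()) return sum == 0 ? 1 : 0;  // empty suffix: 1 if sum used up
    return countSubsets(arr, i + 1, sum)                // exclude arr[i]
         + countSubsets(arr, i + 1, sum - arr[i]);      // include arr[i]
}
```

For {10, 5, 2, 3, 6} and sum 8, the subsets are {5, 3} and {2, 6}.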
Graphs:
While calculating the maximum number of edges, we assume parallel edges and self-loops can't exist. (So the maximum is V(V-1)/2 edges for an undirected graph and V(V-1) for a directed graph.)
The minimum number of edges is 0.
A path is a special walk with no repetition of vertices allowed (a walk is a sequence of vertices in which consecutive vertices are joined by an edge). Sometimes, however, a walk is called a path and a path is called a simple path.
Graph Representation:
Both number to string mapping and string to number mapping are required for most
questions
For adding/removing a vertex, Theta(V^2) time is required, as a new matrix of bigger size must be constructed and all the values copied into it.
However, if extra space has already been preallocated, it won't take V^2 time.
We are going to use Dynamic Sized Arrays in our implementation. We could have used
Linked Lists too.
Adjacency List Implementation:
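A minimal sketch of the dynamic-array representation mentioned above (names are mine):

```cpp
#include <vector>

// Adjacency list as a vector of vectors; vertices are 0..V-1.
void addEdge(std::vector<std::vector<int>>& adj, int u, int v) {
    adj[u].push_back(v);
    adj[v].push_back(u);   // omit this line for a directed graph
}
```

Usage: create `std::vector<std::vector<int>> adj(V);` and call addEdge for each edge; adj[u] then lists all vertices adjacent to u.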
A graph where the number of edges is near the maximum number of edges is called a Dense Graph; when it is much less than that, it is called a Sparse Graph.
Theta(V+E) is better than Theta(V x V) because, for sparse graphs, E is far smaller than V^2, for both Directed and Undirected Graphs.
NOTE:
● Finding all adjacent “from” u in Directed Graphs using List takes
theta(outdegree(u))
● Finding all adjacent “to” u in Directed Graphs using a List takes Theta(V+E), as the whole list must be traversed
In conclusion, adjacency list is the most appropriate DS for sparse graphs, which are
the most practical graphs we see.
Finding all adjacent vertices is a very common operation, used in almost every graph algorithm.
BFS:
First Version:
BFS in Graphs is almost exactly the same as the Level Order Traversal technique of trees, but with a visited array so that we don't output the same vertex more than once.
Note: Here, visited doesn’t mean that the vertex is printed, instead it means that
particular vertex was discovered and added to the queue once. {We don’t even know if
currently it is still in queue or not}
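A sketch of this first version, marking vertices visited on enqueue exactly as described (names are mine):

```cpp
#include <vector>
#include <queue>

// BFS from a given source; returns vertices in the order they are output.
std::vector<int> bfs(const std::vector<std::vector<int>>& adj, int src) {
    std::vector<bool> visited(adj.size(), false);
    std::vector<int> order;
    std::queue<int> q;
    visited[src] = true;   // "visited" = discovered and enqueued once
    q.push(src);
    while (!q.empty()) {
        int u = q.front(); q.pop();
        order.push_back(u);
        for (int v : adj[u])
            if (!visited[v]) { visited[v] = true; q.push(v); }
    }
    return order;   // T.C. O(V+E) with an adjacency list
}
```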
Second Version:
Here no source is given, and the graph may possibly be disconnected.
Our aim is to traverse through the whole graph (through all the nodes).
Note that here the same visited array is being used by all function calls
(bool visited [] = bool *visited is being passed)
Shortest Path in an Unweighted Graph:
BFS first traverses all the nodes that are 1 edge away, then those that are 2 edges away, and so on. Hence, the first path by which a node is discovered is the shortest path to that node. So, BFS helps us find the shortest path to a node in an unweighted graph (i.e., when all edges have the same weight).
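This is plain BFS plus a distance array, which can be sketched as (names are mine):

```cpp
#include <vector>
#include <queue>

// Shortest edge-count distance from src to every vertex; -1 = unreachable.
std::vector<int> shortestDist(const std::vector<std::vector<int>>& adj, int src) {
    std::vector<int> dist(adj.size(), -1);
    std::queue<int> q;
    dist[src] = 0;
    q.push(src);
    while (!q.empty()) {
        int u = q.front(); q.pop();
        for (int v : adj[u])
            if (dist[v] == -1) {          // first discovery = shortest path
                dist[v] = dist[u] + 1;
                q.push(v);
            }
    }
    return dist;   // same T.C. as plain BFS: O(V+E)
}
```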
In fact, Prim's and Dijkstra's algorithms use the same concept, but they use a priority queue instead of a queue and use edge weights to order the expansion.
DFS:
For a given graph there are many possible DFS outputs. In this course, we always visit the smallest-numbered vertex adjacent to a given node first [because the graph is stored in that order: for each node, the smallest adjacent vertex appears first in its list, the next smallest appears next, and so on].
Second Version (No source given & Disconnected Graph):
Could potentially be a disconnected graph. No source vertex is given. Our aim is to
print the whole graph.
What it basically does is: Take one of the vertices as a source, print all vertices
reachable from it. Take another non reached vertex as source, print all reachable and
so on.
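A sketch of this second version, with one shared visited array across all calls (names are mine):

```cpp
#include <vector>

// Recursive DFS from u; the shared visited array is passed by reference.
void dfsRec(const std::vector<std::vector<int>>& adj, int u,
            std::vector<bool>& visited, std::vector<int>& order) {
    visited[u] = true;
    order.push_back(u);
    for (int v : adj[u])
        if (!visited[v]) dfsRec(adj, v, visited, order);
}

// No source given; covers disconnected graphs, one dfsRec call per component.
std::vector<int> dfs(const std::vector<std::vector<int>>& adj) {
    std::vector<bool> visited(adj.size(), false);
    std::vector<int> order;
    for (int u = 0; u < (int)adj.size(); ++u)
        if (!visited[u]) dfsRec(adj, u, visited, order);
    return order;
}
```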
Applications of DFS:
For solving mazes or similar puzzles, wherein the solution is one of the leaf nodes of the search tree, DFS is the go-to method.
(All backtracking algorithms are based on DFS.)
Path finding can be done by both BFS and DFS, but in DFS printing the path is trivial: whenever the destination is reached, print all the vertices on the recursion call stack. In BFS, however, the vertices currently in the queue are just an arbitrary set, not the path.
Since the only extra work done compared to normal BFS is only maintaining the
distance array, the T.C. remains the same as no extra loops are used.
Hence T.C. = O(V+E)
Detect Cycles in an Undirected Graph:
It is the same code as DFS except for the for loop in the function “DFS”. That loop exists in order to cover the case of the graph being disconnected (see the corner case below).
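A sketch of the DFS-based check, where a visited neighbor other than the parent closes a cycle (names are mine):

```cpp
#include <vector>

// Returns true if a cycle is reachable from u; parent is the vertex we came from.
bool dfsCycle(const std::vector<std::vector<int>>& adj, int u, int parent,
              std::vector<bool>& visited) {
    visited[u] = true;
    for (int v : adj[u]) {
        if (!visited[v]) {
            if (dfsCycle(adj, v, u, visited)) return true;
        } else if (v != parent) {
            return true;   // visited neighbor that isn't the parent: cycle
        }
    }
    return false;
}

bool hasCycle(const std::vector<std::vector<int>>& adj) {
    std::vector<bool> visited(adj.size(), false);
    for (int u = 0; u < (int)adj.size(); ++u)   // the loop covering disconnection
        if (!visited[u] && dfsCycle(adj, u, -1, visited)) return true;
    return false;
}
```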