Kadane’s Algorithm
- Used to find the maximum subarray sum in an array
- Time complexity: O(n), space complexity: O(1)
def maxSubArray(nums):
    maxSum = nums[0]
    curSum = nums[0]
    for i in range(1, len(nums)):
        curSum = max(nums[i], curSum + nums[i])
        maxSum = max(maxSum, curSum)
    return maxSum
Explanation
Imagine a Treasure Hunt 🏰
Think of the array nums as a series of treasures, where each number represents
the value of a treasure. Your goal is to find the most valuable contiguous
sequence of treasures.
Initialize the Quest 🎒
- Start your quest with the first treasure (nums[0]) as both your current treasure (curSum) and the maximum treasure (maxSum) you’ve found so far.
Explore the Treasures 🔍
- Move through the treasures one by one, starting from the second treasure (i = 1).
- At each treasure, you have two options:
  - Start a new sequence with the current treasure (nums[i]).
  - Continue the existing sequence by adding the current treasure to your current treasure (curSum + nums[i]).
- Choose the option that gives you the most valuable current treasure (curSum).
Update the Maximum Treasure 💰
- After each step, compare your current treasure (curSum) with the maximum treasure (maxSum) you’ve found so far.
- If the current treasure is more valuable, update the maximum treasure to be the current treasure.
Complete the Quest 🎉
- Once you’ve explored all the treasures, the maximum treasure (maxSum) you’ve found is the most valuable contiguous sequence of treasures!
Remember:
- curSum represents your current treasure (the sum of the current subarray).
- maxSum keeps track of the maximum treasure (the maximum subarray sum) you’ve found so far.
- At each step, you decide whether to start a new sequence or continue the existing one, choosing the option that gives you the most valuable current treasure.
- Update the maximum treasure whenever the current treasure becomes more valuable.
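As a quick sanity check, the function above can be run on a classic sample array (a self-contained sketch that repeats the definition):

```python
def maxSubArray(nums):
    maxSum = nums[0]
    curSum = nums[0]
    for i in range(1, len(nums)):
        # Either start a new subarray at i or extend the current one
        curSum = max(nums[i], curSum + nums[i])
        maxSum = max(maxSum, curSum)
    return maxSum

print(maxSubArray([-2, 1, -3, 4, -1, 2, 1, -5, 4]))  # 6, from [4, -1, 2, 1]
print(maxSubArray([-3, -1, -2]))                     # -1 (best single element)
```

Note that an all-negative array still returns the largest single element, because curSum restarts rather than dropping below the best standalone value.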
Subsets
- Usually used for backtracking problems
- Order does not matter
Example problem: create all possible subsets of a given array
- For each value, we have a choice — either include in the subset or not
- This is a decision tree problem
- Number of subsets: 2^n, where n is the number of input values
- Time complexity: O(n * 2^n), space complexity: O(n)
// Time: O(n * 2^n), Space: O(n)
function subsetsWithoutDuplicates(nums) {
  let subsets = [];
  let curSet = [];
  helper(0, nums, curSet, subsets);
  return subsets;
}

function helper(i, nums, curSet, subsets) {
  if (i >= nums.length) {
    subsets.push([...curSet]);
    return;
  }
  // decision to include nums[i]
  curSet.push(nums[i]);
  helper(i + 1, nums, curSet, subsets);
  curSet.pop();
  // decision NOT to include nums[i]
  helper(i + 1, nums, curSet, subsets);
}
Modified example: array has duplicates
// Time: O(n * 2^n), Space: O(n)
function subsetsWithDuplicates(nums) {
  nums.sort((a, b) => a - b); // numeric sort; default sort() compares as strings
  let curSet = [];
  let subsets = [];
  helper2(0, nums, curSet, subsets);
  return subsets;
}

function helper2(i, nums, curSet, subsets) {
  if (i >= nums.length) {
    subsets.push([...curSet]);
    return;
  }
  // decision to include nums[i]
  curSet.push(nums[i]);
  helper2(i + 1, nums, curSet, subsets);
  curSet.pop();
  // decision NOT to include nums[i]: skip past all duplicates of it
  while (i + 1 < nums.length && nums[i] == nums[i + 1]) {
    i++;
  }
  helper2(i + 1, nums, curSet, subsets);
}
Combinations
- Almost exactly like subsets
- Order does not matter
Example problem: given two numbers n and k, return all possible combinations of size k, choosing from values between 1 and n.
// Time: O(k * 2^n)
function combinations(n, k) {
  let combs = [];
  helper(1, [], combs, n, k);
  return combs;
}

function helper(i, curComb, combs, n, k) {
  if (curComb.length == k) {
    combs.push([...curComb]);
    return;
  }
  if (i > n) {
    return;
  }
  // decision to include i
  curComb.push(i);
  helper(i + 1, curComb, combs, n, k);
  curComb.pop();
  // decision to NOT include i
  helper(i + 1, curComb, combs, n, k);
}
Modified solution
// Time: O(k * C(n, k))
// C(n, k) = n! / (k! * (n-k)!)
function combinations2(n, k) {
  let combs = [];
  helper2(1, [], combs, n, k);
  return combs;
}

function helper2(i, curComb, combs, n, k) {
  if (curComb.length == k) {
    combs.push([...curComb]);
    return;
  }
  if (i > n) {
    return;
  }
  for (let j = i; j < n + 1; j++) {
    curComb.push(j);
    helper2(j + 1, curComb, combs, n, k);
    curComb.pop();
  }
}
Explanation
Imagine you’re a party planner trying to create guest lists for a series of events.
- The Grand Plan (Main Function)
  - You have ‘n’ potential guests and need to create lists of ‘k’ people each
  - You start with an empty list of guest combinations
- The Guest List Maker (Helper Function)
  - You have a special assistant (helper function) who helps you make decisions
- Perfect List Check (Base Case 1)
  - If your current list has exactly ‘k’ people, celebrate! Add it to your collection
- Out of Options (Base Case 2)
  - If you’ve considered more people than exist (i > n), stop and go back
- The Invite Dilemma (Recursive Cases)
  - For each person, you face a dilemma: invite them or not?
- Send an Invite (Include Case)
  - You decide to invite the person
  - Add them to your current list
  - Ask your assistant to continue planning with this person included
  - If it doesn’t work out, remove them from the list (backtrack)
- Skip the Invite (Exclude Case)
  - You decide not to invite the person
  - Ask your assistant to continue planning without this person
- Repeat and Refine (Recursion)
  - Your assistant repeats this process for each potential guest
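The include/exclude decision can also be sketched in Python (a hypothetical port of the JavaScript version above, not code from the original solution):

```python
def combinations(n, k):
    combs = []

    def helper(i, cur):
        if len(cur) == k:          # perfect list: exactly k guests
            combs.append(cur[:])
            return
        if i > n:                  # out of options
            return
        cur.append(i)              # invite guest i
        helper(i + 1, cur)
        cur.pop()                  # backtrack
        helper(i + 1, cur)         # skip guest i
    helper(1, [])
    return combs

print(combinations(4, 2))
# [[1, 2], [1, 3], [1, 4], [2, 3], [2, 4], [3, 4]]
```

The nested helper closes over combs instead of passing it as a parameter, which is the idiomatic Python equivalent of the JavaScript helper signature.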
Permutations
- Order matters
- Algorithm
  - Go to the last stage; that’s one permutation
  - Go back up, and insert this element at every possible position
  - Repeat
Example problem: given a list of nums, return all possible distinct permutations of nums
## Time: O(n^2 * n!)
def permutationsRecursive(nums):
    return helper(0, nums)

def helper(i, nums):
    if i == len(nums):
        return [[]]
    resPerms = []
    perms = helper(i + 1, nums)
    for p in perms:
        for j in range(len(p) + 1):
            pCopy = p.copy()
            pCopy.insert(j, nums[i])
            resPerms.append(pCopy)
    return resPerms
## Time: O(n^2 * n!)
def permutationsIterative(nums):
    perms = [[]]
    for n in nums:
        nextPerms = []
        for p in perms:
            for i in range(len(p) + 1):
                pCopy = p.copy()
                pCopy.insert(i, n)
                nextPerms.append(pCopy)
        perms = nextPerms
    return perms
Explanation
Imagine you’re planning a party and deciding how to arrange your guests in a line. Here’s how this algorithm works:
1. Start with the Last Guest: Begin with the last person on your guest list. They can only stand in one way - by themselves.
2. Add One Guest at a Time: Moving backwards through your list, for each new guest:
   - Look at all the ways the guests after them are already arranged.
   - For each of these arrangements:
     - Try putting the new guest in every possible spot.
     - Remember each new arrangement you create.
3. Keep Going: Repeat step 2 until you’ve added all guests to your arrangements.
4. Party Time: Now you have all possible ways to arrange your guests!
Key Things to Remember:
- Start from the end of the list and work backwards.
- For each new person, try putting them in every possible spot in the existing arrangements.
- Build up from smaller arrangements to larger ones.
This “party planner” approach mimics how the algorithm recursively builds permutations. It starts with the simplest case (one person) and gradually builds up to the full list, considering all possible positions for each new element added.
The recursion in the algorithm is like you, the party planner, asking your assistant to handle arranging the guests after the current one. You then take their arrangements and figure out where to put the current guest in each of them.
This method ensures you find all possible arrangements (permutations) of your guest list, just like the algorithm does for any given list of elements.
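A quick check of the iterative builder on a small input (a self-contained sketch repeating the definition above):

```python
def permutationsIterative(nums):
    perms = [[]]
    for n in nums:
        nextPerms = []
        for p in perms:
            # Insert n at every possible position of each existing arrangement
            for i in range(len(p) + 1):
                pCopy = p.copy()
                pCopy.insert(i, n)
                nextPerms.append(pCopy)
        perms = nextPerms
    return perms

print(len(permutationsIterative([1, 2, 3])))  # 6 arrangements (3!)
```

Each pass multiplies the number of arrangements by the current position count, which is why the result size grows as n!.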
Dijkstra’s (Shortest Path)
- BFS is not sufficient for finding shortest path in weighted graphs
- Dijkstra’s algorithm is used for finding the shortest path from a source node to all other nodes in a graph
Algorithm
Time complexity: O(E log V), where V is the number of vertices and E is the number of edges
import heapq
from typing import Dict, List

class Solution:
    # Implementation for Dijkstra's shortest path algorithm
    def shortestPath(self, n: int, edges: List[List[int]], src: int) -> Dict[int, int]:
        adj = {}
        for i in range(n):
            adj[i] = []
        # s = src, d = dst, w = weight
        for s, d, w in edges:
            adj[s].append([d, w])

        # Compute shortest paths
        shortest = {}
        minHeap = [[0, src]]
        while minHeap:
            w1, n1 = heapq.heappop(minHeap)
            if n1 in shortest:
                continue
            shortest[n1] = w1
            for n2, w2 in adj[n1]:
                if n2 not in shortest:
                    heapq.heappush(minHeap, [w1 + w2, n2])

        # Fill in missing (unreachable) nodes
        for i in range(n):
            if i not in shortest:
                shortest[i] = -1
        return shortest
Explanation
Imagine Dijkstra running a package delivery service in a city:
- Map the City (Setup)
  - Dijkstra creates a map of the city (adj dictionary)
  - He lists all possible destinations (range(n))
- Plan Routes (Edge Processing)
  - For each road (edge), he notes the destination and distance
  - He organizes this info for easy access (adj[s].append([d, w]))
- Start the Journey (Initialize)
  - Dijkstra starts at the source location (src)
  - He has a smart GPS (minHeap) that always suggests the shortest next step
- Navigate the City (Main Loop)
  - Dijkstra follows the GPS suggestions (heapq.heappop(minHeap))
  - He marks each visited location with the distance traveled (shortest[n1] = w1)
- Explore New Routes (Process Neighbors)
  - At each stop, he looks for connected locations
  - He updates his GPS with new possible routes (heapq.heappush(minHeap, [w1 + w2, n2]))
- Avoid Revisits (Skip Visited Nodes)
  - If he’s been to a location before, he doesn’t go again (if n1 in shortest: continue)
- Record the Journey (Finalize Results)
  - For places he couldn’t reach, he marks as unreachable (-1)
  - He compiles a final report of shortest distances to all locations
Key Things to Remember:
- Always go to the closest unvisited place next.
- Keep updating distances if you find shorter routes.
- Once you visit a place, you’ve found the shortest path to it.
This “greedy explorer” approach ensures you always find the shortest path to each location, just like Dijkstra’s algorithm does for nodes in a graph. The algorithm is greedy because it always chooses the closest option, and it works because the shortest overall path must be made up of shortest sub-paths.
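The delivery-route walkthrough can be exercised on a tiny graph. This standalone sketch mirrors the class method above (shortest_path is a hypothetical function name used just for this note):

```python
import heapq

def shortest_path(n, edges, src):
    adj = {i: [] for i in range(n)}
    for s, d, w in edges:
        adj[s].append((d, w))
    shortest = {}
    min_heap = [(0, src)]
    while min_heap:
        w1, n1 = heapq.heappop(min_heap)
        if n1 in shortest:
            continue  # already settled with a shorter (or equal) distance
        shortest[n1] = w1
        for n2, w2 in adj[n1]:
            if n2 not in shortest:
                heapq.heappush(min_heap, (w1 + w2, n2))
    for i in range(n):
        shortest.setdefault(i, -1)  # mark unreachable nodes
    return shortest

print(shortest_path(4, [[0, 1, 1], [0, 2, 4], [1, 2, 2], [2, 3, 1]], 0))
# {0: 0, 1: 1, 2: 3, 3: 4}
```

Note how the direct road 0 → 2 of weight 4 loses to the detour 0 → 1 → 2 of weight 3: the heap pops the cheaper route first, so node 2 is settled at distance 3.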
Prim’s (Minimum Spanning Tree)
- Prim’s algorithm is used to find the minimum spanning tree in an undirected graph
- A minimum spanning tree is a subset of the edges of a connected, edge-weighted graph that connects all the vertices together without any cycles and with the minimum possible total edge weight
- N-1 edges are required to connect N nodes
Algorithm
Time complexity: O(E log V), where V is the number of vertices and E is the number of edges
import heapq

def prim(graph):
    start_vertex = next(iter(graph))
    mst = []
    visited = set([start_vertex])
    edges = [
        (cost, start_vertex, to)
        for to, cost in graph[start_vertex].items()
    ]
    heapq.heapify(edges)
    while edges:
        cost, frm, to = heapq.heappop(edges)
        if to not in visited:
            visited.add(to)
            mst.append((frm, to, cost))
            for next_to, next_cost in graph[to].items():
                if next_to not in visited:
                    heapq.heappush(edges, (next_cost, to, next_to))
    return mst

## Example usage:
graph = {
    'A': {'B': 2, 'C': 3},
    'B': {'A': 2, 'C': 1, 'D': 1},
    'C': {'A': 3, 'B': 1, 'D': 4},
    'D': {'B': 1, 'C': 4}
}
minimum_spanning_tree = prim(graph)
print("Minimum Spanning Tree:", minimum_spanning_tree)
Explanation
Imagine you’re building a neighborhood where everyone wants to be connected in the most efficient way possible.
- Choose a Starting House (Initial Vertex)
  - Pick any house to start (it doesn’t matter which one)
- Welcome Wagon (Visited Set)
  - Keep track of who’s already in the neighborhood
- Potential Connections (Min Heap of Edges)
  - Maintain a list of possible connections, always sorted by cost
  - Start with all connections from the first house
- Growing the Neighborhood (Main Loop)
  - While there are potential connections:
    a. Pick the cheapest connection (pop from heap)
    b. If it reaches a new house:
      - Welcome the new house to the neighborhood
      - Add this connection to your neighborhood plan
      - Look at all connections from this new house
      - Add new potential connections to your sorted list
- Neighborhood Complete (MST Formed)
  - Stop when every house is connected
Key Points to Remember:
- Always choose the cheapest available connection
- Only add connections to new houses
- Keep your list of potential connections sorted by cost
- The process naturally avoids cycles
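Running Prim’s on the sample neighborhood graph above should connect all four houses at a total cost of 4. A compact check (prim_total_cost is a hypothetical helper for this note that sums edge costs instead of collecting the edge list):

```python
import heapq

def prim_total_cost(graph):
    start = next(iter(graph))
    visited = {start}
    # Seed the heap with every connection out of the starting house
    edges = [(cost, start, to) for to, cost in graph[start].items()]
    heapq.heapify(edges)
    total = 0
    while edges:
        cost, frm, to = heapq.heappop(edges)
        if to not in visited:
            visited.add(to)
            total += cost
            for nxt, c in graph[to].items():
                if nxt not in visited:
                    heapq.heappush(edges, (c, to, nxt))
    return total

graph = {
    'A': {'B': 2, 'C': 3},
    'B': {'A': 2, 'C': 1, 'D': 1},
    'C': {'A': 3, 'B': 1, 'D': 4},
    'D': {'B': 1, 'C': 4},
}
print(prim_total_cost(graph))  # 4 (edges A-B, B-C, B-D)
```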
Kruskal’s (Minimum Spanning Tree)
class DisjointSet:
    def __init__(self, vertices):
        self.parent = {v: v for v in vertices}
        self.rank = {v: 0 for v in vertices}

    def find(self, item):
        if self.parent[item] != item:
            self.parent[item] = self.find(self.parent[item])
        return self.parent[item]

    def union(self, x, y):
        xroot = self.find(x)
        yroot = self.find(y)
        if self.rank[xroot] < self.rank[yroot]:
            self.parent[xroot] = yroot
        elif self.rank[xroot] > self.rank[yroot]:
            self.parent[yroot] = xroot
        else:
            self.parent[yroot] = xroot
            self.rank[xroot] += 1

def kruskal_mst(graph):
    edges = [(weight, u, v) for u in graph for v, weight in graph[u].items()]
    edges.sort()
    vertices = list(graph.keys())
    disjoint_set = DisjointSet(vertices)
    mst = []
    for weight, u, v in edges:
        if disjoint_set.find(u) != disjoint_set.find(v):
            disjoint_set.union(u, v)
            mst.append((u, v, weight))
    return mst

## Example usage
graph = {
    'A': {'B': 4, 'C': 2},
    'B': {'A': 4, 'C': 1, 'D': 5},
    'C': {'A': 2, 'B': 1, 'D': 8, 'E': 10},
    'D': {'B': 5, 'C': 8, 'E': 2, 'F': 6},
    'E': {'C': 10, 'D': 2, 'F': 3},
    'F': {'D': 6, 'E': 3}
}
minimum_spanning_tree = kruskal_mst(graph)
print("Edges in the Minimum Spanning Tree:")
for edge in minimum_spanning_tree:
    print(f"{edge[0]} -- {edge[1]} : {edge[2]}")
Explanation
Mnemonic: “Eagerly Find Minimum Connections”
Breakdown of the Mnemonic
- Eagerly - Edges: Start with all the edges in the graph.
- Find - Find the Minimum: Sort all the edges by their weights in ascending order.
- Minimum - Minimum Spanning Tree: Initialize an empty tree (MST) to store the selected edges.
- Connections - Connect components: Use a union-find structure to keep track of connected components and add edges to the MST only if they connect different components.
Steps to Remember
- List All Edges: Write down or visualize all the edges in the graph.
- Sort Edges: Sort the edges by weight (think of it as lining them up from lightest to heaviest).
- Initialize MST: Start with an empty MST.
- Iterate and Connect:
- Go through the sorted edges one by one.
- For each edge, check if it connects two different components using the union-find structure.
- If it does, add it to the MST and union the two components.
- Stop When Full: Continue until you have added V - 1 edges (where V is the number of vertices).
Visualization Technique
- Draw the Graph: Create a graph with vertices and weighted edges.
- Highlight the Edges: As you sort and select edges, highlight them in order of selection.
- Show Connections: Use different colors or lines to represent connected components as you add edges to the MST.
Example Summary
- Graph: A—4—B, A—2—C, B—1—C, B—5—D, C—8—D
- Sorted Edges: B—C (1), A—C (2), A—B (4), B—D (5), C—D (8)
- MST Steps:
  - Add B—C (1)
  - Add A—C (2)
  - Skip A—B (4): A and B are already in the same component, so it would form a cycle
  - Add B—D (5) to bring D into the tree
Final MST: B—C, A—C, B—D (total weight 8).
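Kruskal’s selection on this small graph can be checked in code. A compact sketch with a minimal inline union-find (kruskal is a hypothetical helper for this note, simpler than the DisjointSet class above):

```python
def kruskal(edges, vertices):
    # edges: list of (weight, u, v) tuples; union-find without ranks
    parent = {v: v for v in vertices}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:            # only join different components (no cycles)
            parent[ru] = rv
            mst.append((u, v, w))
    return mst

edges = [(4, 'A', 'B'), (2, 'A', 'C'), (1, 'B', 'C'), (5, 'B', 'D'), (8, 'C', 'D')]
print(kruskal(edges, 'ABCD'))
# [('B', 'C', 1), ('A', 'C', 2), ('B', 'D', 5)]
```

The output shows A—B (4) being skipped: after B—C and A—C, vertices A and B already share a root, so the next edge to join a new component is B—D (5).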
Topological Sort
- Used to find the order of tasks in a directed acyclic graph (DAG)
- Time complexity: O(V + E)
Algorithm
// Given a directed acyclic graph, return a valid
// topological ordering of the graph.
function topologicalSort(edges, n) {
  // Create an adjacency list representation of the graph
  let adj = new Map();
  for (let i = 1; i < n + 1; i++) {
    adj.set(i, new Array());
  }
  for (let edge of edges) {
    let src = edge[0],
      dst = edge[1];
    adj.get(src).push(dst);
  }
  // Perform DFS traversal and store the result in topSort array
  let topSort = new Array();
  let visit = new Set();
  for (let i = 1; i < n + 1; i++) {
    dfs(i, adj, visit, topSort);
  }
  // Reverse the topSort array to get the correct ordering
  topSort.reverse();
  return topSort;
}

function dfs(src, adj, visit, topSort) {
  if (visit.has(src)) {
    return;
  }
  visit.add(src);
  for (let neighbor of adj.get(src)) {
    dfs(neighbor, adj, visit, topSort);
  }
  topSort.push(src);
}
Explanation
Here’s an easy way to remember topological sort:
Think of tasks and dependencies
Imagine you have a list of tasks to complete, but some tasks depend on others being finished first. For example:
- Task A: Bake a cake
- Task B: Frost the cake (depends on A)
- Task C: Add sprinkles (depends on B)
You can’t frost the cake until it’s baked, and you can’t add sprinkles until it’s frosted. The correct order to do the tasks is A, B, then C.
Topological sort finds this order
Topological sort takes a directed graph where edges represent dependencies between nodes (tasks) and produces a linear ordering such that:
- If there is an edge from node A to node B, then A appears before B in the ordering[1]
In other words, it puts the tasks in an order that satisfies all the dependencies.
Key points to remember:
- Topological sort only works on directed acyclic graphs (DAGs)
- If the graph has cycles, there is no valid ordering[1]
- Two main algorithms for topological sort:
- Kahn’s algorithm (BFS-based)
- DFS-based algorithm[4]
- In Kahn’s algorithm, start with nodes that have no incoming edges (in-degree of 0)[4]
- In DFS, the node that finishes last goes first in the ordering[3]
So in summary - when you have “tasks” (nodes) with “dependencies” (directed edges), think topological sort to find a valid ordering!
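Kahn’s BFS-based algorithm mentioned above can be sketched in Python (a minimal sketch assuming 1-indexed nodes, matching the DFS version’s convention):

```python
from collections import deque

def kahn_topological_sort(n, edges):
    # Kahn's algorithm: repeatedly remove nodes with in-degree 0
    adj = {i: [] for i in range(1, n + 1)}
    indegree = {i: 0 for i in range(1, n + 1)}
    for src, dst in edges:
        adj[src].append(dst)
        indegree[dst] += 1
    queue = deque(i for i in indegree if indegree[i] == 0)
    order = []
    while queue:
        node = queue.popleft()
        order.append(node)
        for nb in adj[node]:
            indegree[nb] -= 1
            if indegree[nb] == 0:   # all prerequisites done
                queue.append(nb)
    return order if len(order) == n else []  # [] signals a cycle

# Bake (1) -> Frost (2) -> Sprinkles (3), as in the cake example
print(kahn_topological_sort(3, [[1, 2], [2, 3]]))  # [1, 2, 3]
```

A nice bonus of Kahn’s approach: if some nodes never reach in-degree 0, the graph has a cycle, so it doubles as a cycle detector.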