Graph Algorithms I Keep Coming Back To
Why Graphs?
If you spend enough time doing competitive programming, you start to see graphs everywhere. A city road network, a dependency resolver, a social network, a game state space — they're all graphs under the hood. Getting comfortable with a core set of graph algorithms pays dividends both in contests and in production systems.
These are the ones I return to most often.
BFS and DFS — The Workhorses
Breadth-First Search and Depth-First Search are the foundation everything else builds on. BFS is your go-to for shortest path in unweighted graphs and for level-order traversals. DFS is more versatile — it powers cycle detection, topological sort, and is the backbone of more advanced algorithms like Tarjan's.
The key insight I had late: choose BFS when you care about distance, DFS when you care about structure.
from collections import deque

def bfs(graph, start):
    # Returns the shortest unweighted distance from start to each reachable node.
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph[node]:
            if neighbor not in dist:  # first visit = shortest distance
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    return dist
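Since the BFS above is paired with the claim that DFS powers topological sort, here is a minimal recursive sketch of that use. The function name and the 0..n-1 node-labeling convention are my own, not from the post, and the recursive form assumes the graph fits within Python's recursion limit:

```python
def dfs_topo(graph, n):
    # Topological order of a DAG with nodes 0..n-1; graph[u] lists successors.
    order, seen = [], [False] * n

    def visit(u):
        seen[u] = True
        for v in graph[u]:
            if not seen[v]:
                visit(v)
        order.append(u)  # u finishes only after all of its descendants

    for u in range(n):
        if not seen[u]:
            visit(u)
    return order[::-1]  # reverse finish order = topological order
```

Appending each node after its descendants are done is exactly the "structure, not distance" view of DFS: the finish times encode the dependency order.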
Dijkstra — Shortest Paths with Weights
Once you add non-negative edge weights, BFS no longer gives you shortest paths. Dijkstra fills that gap. The standard implementation uses a min-heap and runs in O((V + E) log V).
A mistake I made early on: using Dijkstra on graphs with negative edges. That's where Bellman-Ford or SPFA comes in. Know your constraints.
import heapq

def dijkstra(graph, start):
    # graph[u] is an iterable of (neighbor, weight) pairs; weights must be non-negative.
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue  # stale heap entry: a shorter path to u was already settled
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
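As noted above, negative edges call for Bellman-Ford instead. A minimal sketch, assuming an edge-list representation and 0..n-1 node labels (both my conventions, not the post's):

```python
def bellman_ford(edges, n, start):
    # edges is a list of (u, v, w) triples; weights may be negative.
    # Returns a distance list, or None if a negative cycle is reachable from start.
    INF = float('inf')
    dist = [INF] * n
    dist[start] = 0
    for _ in range(n - 1):  # n-1 rounds of relaxation always suffice
        changed = False
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                changed = True
        if not changed:
            break  # already converged, stop early
    for u, v, w in edges:  # one extra pass detects negative cycles
        if dist[u] + w < dist[v]:
            return None
    return dist
```

The early-exit check makes the common case much faster than the worst-case O(VE) bound suggests, which is the same observation that motivates SPFA.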
Union-Find — Connectivity in Near O(1)
Union-Find (Disjoint Set Union) is elegant. It answers "are these two nodes connected?" in O(α(n)) amortized time per operation — α being the inverse Ackermann function, effectively constant — given path compression and union by rank. It's the core of Kruskal's MST algorithm and shows up constantly in dynamic connectivity problems.
class DSU:
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])  # path compression
        return self.parent[x]

    def union(self, x, y):
        px, py = self.find(x), self.find(y)
        if px == py:
            return False  # already in the same component
        if self.rank[px] < self.rank[py]:
            px, py = py, px  # union by rank: attach the shallower tree
        self.parent[py] = px
        if self.rank[px] == self.rank[py]:
            self.rank[px] += 1
        return True
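To show the DSU earning its keep in Kruskal's algorithm, here is a hedged sketch of computing a minimum spanning forest weight. The function name and (w, u, v) edge convention are mine, and the union-find is inlined in a compact path-halving form so the sketch stands alone:

```python
def kruskal(n, edges):
    # edges is a list of (w, u, v); returns total weight of the min spanning forest.
    parent = list(range(n))

    def find(x):  # path-halving variant of the same union-find idea
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total = 0
    for w, u, v in sorted(edges):  # consider edges cheapest-first
        ru, rv = find(u), find(v)
        if ru != rv:  # edge joins two components, so it is safe to take
            parent[ru] = rv
            total += w
    return total
```

The union-find is doing all the real work here: the "is this edge safe?" test is just a connectivity query.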
Tarjan's SCC — Finding Strongly Connected Components
This one took me a while to internalize. Tarjan's algorithm finds all Strongly Connected Components in a directed graph in a single O(V + E) DFS pass. It's useful for condensing a graph into a DAG, which often simplifies otherwise intractable problems.
The key machinery is an explicit stack holding the nodes of the component being built, plus a lowlink value per node that tracks the earliest DFS index reachable from that node's subtree.
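The stack-plus-lowlink idea can be sketched as follows. This is a minimal recursive version assuming 0..n-1 node labels (my convention); in a contest setting the recursion would need to be made iterative or the recursion limit raised for deep graphs:

```python
def tarjan_scc(graph, n):
    # SCCs of a directed graph with nodes 0..n-1; graph[u] lists successors.
    index = [None] * n       # DFS discovery order of each node
    low = [0] * n            # earliest index reachable from the node's subtree
    on_stack = [False] * n
    stack, sccs = [], []
    counter = [0]

    def dfs(u):
        index[u] = low[u] = counter[0]
        counter[0] += 1
        stack.append(u)
        on_stack[u] = True
        for v in graph[u]:
            if index[v] is None:
                dfs(v)
                low[u] = min(low[u], low[v])
            elif on_stack[v]:  # edge back into the component under construction
                low[u] = min(low[u], index[v])
        if low[u] == index[u]:  # u is the root of an SCC: pop it off whole
            comp = []
            while True:
                w = stack.pop()
                on_stack[w] = False
                comp.append(w)
                if w == u:
                    break
            sccs.append(comp)

    for u in range(n):
        if index[u] is None:
            dfs(u)
    return sccs
```

Components come out in reverse topological order of the condensed DAG, which is often exactly the order you want for the follow-up DP.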
Closing Thought
The real skill in competitive programming isn't memorizing these algorithms — it's pattern matching. Recognizing that "this problem is really just shortest path in a state space" or "this dependency problem reduces to topological sort" is what separates fast solvers from slow ones. The code is almost secondary.
Build the mental library. The implementations follow naturally.