Chapter III · The patterns of problem-solving

Algorithm Patterns

There are not a thousand kinds of problems—there are about fifteen patterns, dressed up a thousand ways. Learn to recognise the costume, and every problem becomes one you've already seen.

The secret nobody tells you: interviewers reuse the same patterns. Once you can read a problem and feel your brain whisper "this is a sliding window" or "this is a topological sort," you stop solving problems and start recognising them.

§ 1 · Two Pointers

Two Pointers

Two pointers is exactly what it sounds like—two indices walking through your data, usually a sorted array or string. They might start at opposite ends and squeeze toward each other, or both start at the front and move at different speeds. The technique replaces an O(n²) nested loop with an O(n) single sweep.

Imagine a line of people sorted by height, and you want a pair whose heights add up to exactly two metres. A nested loop checks every pair—exhausting. With two pointers, one finger on the shortest person, one on the tallest: sum too big? Move the right finger down. Sum too small? Move the left finger up. One pass, done.

When to recognise it

  • The array is sorted (or you can sort it) and you're looking for a pair/triple with a target sum.
  • You're checking a string for palindrome properties.
  • You're partitioning an array in place (e.g., moving zeros to the end).
  • The brute force is "check every pair" → O(n²). Two pointers gets you to O(n).

Template — opposite ends

// Two Sum on a SORTED array
public int[] twoSum(int[] nums, int target) {
    int l = 0, r = nums.length - 1;
    while (l < r) {
        int sum = nums[l] + nums[r];
        if (sum == target) return new int[]{l, r};
        if (sum < target) l++;
        else              r--;
    }
    return new int[]{-1, -1};
}

Template — same direction (fast/slow)

// Remove duplicates from a sorted array in place
public int removeDuplicates(int[] nums) {
    if (nums.length == 0) return 0;   // guard: nothing to deduplicate
    int slow = 0;
    for (int fast = 1; fast < nums.length; fast++) {
        if (nums[fast] != nums[slow]) {
            slow++;
            nums[slow] = nums[fast];
        }
    }
    return slow + 1;
}
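The same fast/slow sweep handles the in-place partition from the checklist (moving zeros to the end) — a quick sketch:

```java
// Move every zero to the end, keeping non-zeros in their original order
public static void moveZeroes(int[] nums) {
    int slow = 0;                                    // next write slot for a non-zero
    for (int fast = 0; fast < nums.length; fast++)
        if (nums[fast] != 0) nums[slow++] = nums[fast];
    while (slow < nums.length) nums[slow++] = 0;     // fill the tail with zeros
}
```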
§ 2 · Sliding Window

Sliding Window

A sliding window is two pointers that maintain a "window" of elements between them, expanding the right edge to include more and shrinking the left edge when the window breaks a constraint. It transforms "look at every contiguous subarray" (O(n²) or worse) into a single linear sweep.

Picture pushing a small rectangular magnifying glass across a row of letters, growing it whenever you can, shrinking it when you have to. You never go back. The information about "what's currently in the window" updates as the glass moves—and your answer is the best window you've seen.

When to recognise it

  • Problem mentions a contiguous subarray or substring.
  • You're tracking something inside a window — max, sum, distinct chars, frequency.
  • "Longest / shortest / count of subarrays with property X."
  • Brute force is O(n²) checking every (i, j) pair → window collapses it to O(n).

Template — variable-size window

// Longest substring with at most K distinct characters
public int longestKDistinct(String s, int k) {
    Map<Character, Integer> freq = new HashMap<>();
    int left = 0, best = 0;

    for (int right = 0; right < s.length(); right++) {
        char c = s.charAt(right);
        freq.put(c, freq.getOrDefault(c, 0) + 1);

        // Shrink until the window is valid again
        while (freq.size() > k) {
            char lc = s.charAt(left);
            freq.put(lc, freq.get(lc) - 1);
            if (freq.get(lc) == 0) freq.remove(lc);
            left++;
        }
        best = Math.max(best, right - left + 1);
    }
    return best;
}

Template — fixed-size window

// Max sum of any K consecutive elements
public int maxSumK(int[] nums, int k) {
    int sum = 0;
    for (int i = 0; i < k; i++) sum += nums[i];
    int best = sum;
    for (int i = k; i < nums.length; i++) {
        sum += nums[i] - nums[i - k];    // slide
        best = Math.max(best, sum);
    }
    return best;
}
§ 3 · Prefix Sum

Prefix Sum

A prefix-sum array stores running totals: prefix[i] = sum of nums[0..i-1]. Once you have it, the sum of any subarray [l..r] is just prefix[r+1] - prefix[l]—a single subtraction instead of looping. It's the world's most underrated trick.

Imagine a ruler. The distance between the 13 cm mark and the 27 cm mark? You don't measure the gap—you do 27 − 13 = 14. A prefix sum turns "what's the sum between positions L and R?" into the exact same subtraction.

Template

int[] prefix = new int[nums.length + 1];
for (int i = 0; i < nums.length; i++)
    prefix[i + 1] = prefix[i] + nums[i];

// sum of nums[l..r] inclusive:
int sum = prefix[r + 1] - prefix[l];

The hashmap upgrade — "subarray sum equals K"

Combine prefix sums with a hashmap and you can count subarrays with a specific sum in O(n):

public int subarraySum(int[] nums, int k) {
    Map<Integer, Integer> seen = new HashMap<>();
    seen.put(0, 1);                  // empty prefix
    int sum = 0, count = 0;
    for (int n : nums) {
        sum += n;
        count += seen.getOrDefault(sum - k, 0);
        seen.put(sum, seen.getOrDefault(sum, 0) + 1);
    }
    return count;
}
§ 5 · Sorting

Sorting

For interview purposes, you rarely implement a sort by hand — Java's built-in Arrays.sort uses Dual-Pivot Quicksort for primitives and Timsort for objects, both excellent. What matters is knowing when to sort, the cost (O(n log n)), and how to customise comparison.

Sorting is arranging a dealt hand of cards into order. Bubble sort swaps adjacent cards endlessly—slow. Merge sort splits the deck, sorts each half, then merges them—elegant. Quicksort picks a pivot card and partitions everything else into "smaller" and "bigger" piles, then recurses—fast on average. You don't need to write them. You need to know which costs what.

Quick reference

Algorithm            Avg          Worst        Space      Stable?
Bubble / Insertion   O(n²)        O(n²)        O(1)       Yes
Merge Sort           O(n log n)   O(n log n)   O(n)       Yes
Quick Sort           O(n log n)   O(n²)        O(log n)   No
Heap Sort            O(n log n)   O(n log n)   O(1)       No
Counting Sort        O(n + k)     O(n + k)     O(k)       Yes
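Counting sort is the one row above you might actually write by hand — a sketch that assumes non-negative keys bounded by maxVal:

```java
// Counting sort — O(n + k) for non-negative ints no larger than maxVal
public static int[] countingSort(int[] nums, int maxVal) {
    int[] count = new int[maxVal + 1];
    for (int x : nums) count[x]++;            // tally each value
    int[] out = new int[nums.length];
    int idx = 0;
    for (int v = 0; v <= maxVal; v++)         // emit values in ascending order
        for (int c = 0; c < count[v]; c++)
            out[idx++] = v;
    return out;
}
```

Only worth it when the value range k is comparable to n — otherwise the count array dominates.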

Sorting in Java

// Primitives — ascending only
int[] nums = {3, 1, 4, 1, 5};
Arrays.sort(nums);

// Objects with custom comparator
Integer[] arr = {3, 1, 4};
Arrays.sort(arr, (a, b) -> Integer.compare(b, a));    // descending — (a, b) -> b - a can overflow

// Sort a List
List<String> list = new ArrayList<>(List.of("banana", "apple"));
Collections.sort(list);

// Sort an int[][] by second column
int[][] pairs = {{1, 3}, {2, 1}};
Arrays.sort(pairs, (a, b) -> Integer.compare(a[1], b[1]));

// Multi-key sort
Arrays.sort(pairs, (a, b) -> {
    if (a[0] != b[0]) return Integer.compare(a[0], b[0]);
    return Integer.compare(a[1], b[1]);
});
§ 6 · Recursion

Recursion

Recursion is a function that solves a problem by calling itself on a smaller piece of the same problem. The art is in two parts: the base case (when do we stop?) and the recursive case (how does this problem reduce to a smaller version?). Every tree problem, every backtracking problem, every divide-and-conquer algorithm is recursion in disguise.

Russian nesting dolls. To "open all the dolls," you open the outer doll and then — for the smaller doll inside — repeat the same process. You don't need a separate procedure for the inner doll. The base case: when the doll has nothing inside, stop.

The three questions to ask

  1. What's the base case? The smallest input where the answer is obvious without further recursion.
  2. What's the recurrence? Assuming the function works for inputs smaller than n, how do I build the answer for n?
  3. Does it terminate? Does every recursive call move toward the base case?

Template

// Factorial — the canonical example
public long factorial(int n) {
    if (n <= 1) return 1;          // base case
    return n * factorial(n - 1);    // recurse
}

// Tree height
public int height(TreeNode n) {
    if (n == null) return 0;
    return 1 + Math.max(height(n.left), height(n.right));
}
Trust the recursion
The hardest thing about recursion isn't writing it—it's believing the recursive call works without tracing through it. When you call height(n.left), assume it correctly returns the height of the left subtree. Don't trace it. Build the answer for the current node and let recursion handle the rest.
§ 7 · Backtracking

Backtracking

Backtracking is recursion that tries every option, undoes the bad ones, and remembers the good ones. You make a choice, recurse, then "un-make" the choice so you can try the next one. It's how you generate all permutations, all subsets, solve sudoku, place N queens, navigate mazes. Anytime the problem says "find all..." or "is there a way to...", backtracking is on the table.

Imagine you're in a maze with breadcrumbs. At each fork you pick a direction, drop a breadcrumb, and walk. If you hit a dead end, you walk back to the last fork, pick up the breadcrumb (so the path is "clean"), and try a different direction. Backtracking is just that—make a choice, explore, undo, try the next.

The universal template (pseudocode)

void backtrack(state, choices) {
    if (isGoal(state)) {
        result.add(new ArrayList<>(state));   // snapshot!
        return;
    }
    for (choice : choices) {
        if (!isValid(choice, state)) continue;
        state.add(choice);                // make choice
        backtrack(state, choices);        // explore
        state.remove(state.size() - 1);   // UNDO
    }
}

Example — generate all subsets

public List<List<Integer>> subsets(int[] nums) {
    List<List<Integer>> res = new ArrayList<>();
    backtrack(0, nums, new ArrayList<>(), res);
    return res;
}

private void backtrack(int start, int[] nums,
                       List<Integer> cur, List<List<Integer>> res) {
    res.add(new ArrayList<>(cur));        // every state is an answer
    for (int i = start; i < nums.length; i++) {
        cur.add(nums[i]);                    // choose
        backtrack(i + 1, nums, cur, res);     // explore
        cur.remove(cur.size() - 1);          // un-choose
    }
}
Always add a copy
Notice new ArrayList<>(cur). If you add cur directly, every entry in res would be the same reference that keeps getting mutated. You'd end up with a list of empty lists. Always snapshot.
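The same choose/explore/un-choose shape generates permutations, the other "find all" staple. Since order matters and every element appears exactly once, a used[] mask replaces the start index — a sketch:

```java
import java.util.*;

// All orderings of distinct values — backtracking with a used[] mask
public static List<List<Integer>> permute(int[] nums) {
    List<List<Integer>> res = new ArrayList<>();
    permuteHelper(nums, new boolean[nums.length], new ArrayList<>(), res);
    return res;
}

private static void permuteHelper(int[] nums, boolean[] used,
                                  List<Integer> cur, List<List<Integer>> res) {
    if (cur.size() == nums.length) {
        res.add(new ArrayList<>(cur));            // snapshot, as always
        return;
    }
    for (int i = 0; i < nums.length; i++) {
        if (used[i]) continue;                    // already placed on this path
        used[i] = true;
        cur.add(nums[i]);                         // choose
        permuteHelper(nums, used, cur, res);      // explore
        cur.remove(cur.size() - 1);               // un-choose
        used[i] = false;
    }
}
```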
§ 8 · BFS

Breadth-First Search

BFS explores a graph or tree level by level, like ripples spreading from a pebble. Because it reaches everything one step away before going two steps, BFS is the natural choice for shortest path in an unweighted graph and for any "minimum number of moves" question.

Picture a rumour spreading through a school. At minute 1, only your best friend knows. At minute 2, your friend's friends know. At minute 3, their friends know. Trace any "how did you hear?" chain back to you, and it's the shortest chain possible. BFS computes exactly that—the fewest hops from the source.

When to recognise it

  • "Shortest path / fewest moves / minimum steps" in an unweighted graph.
  • "Level-order" traversal of a tree.
  • Grid problems where each cell connects to its 4 neighbours.
  • Multi-source spreading (rotting oranges, walls and gates).

Template — grid BFS

int[][] DIRS = {{-1,0}, {1,0}, {0,-1}, {0,1}};

public int shortestPath(int[][] grid, int[] start, int[] end) {
    int R = grid.length, C = grid[0].length;
    boolean[][] seen = new boolean[R][C];
    Queue<int[]> q = new ArrayDeque<>();
    q.offer(new int[]{start[0], start[1]});
    seen[start[0]][start[1]] = true;
    int steps = 0;

    while (!q.isEmpty()) {
        int sz = q.size();
        for (int i = 0; i < sz; i++) {
            int[] cur = q.poll();
            if (cur[0] == end[0] && cur[1] == end[1])
                return steps;
            for (int[] d : DIRS) {
                int r = cur[0] + d[0], c = cur[1] + d[1];
                if (r < 0 || r >= R || c < 0 || c >= C) continue;
                if (seen[r][c] || grid[r][c] == 1) continue;   // 1 = wall
                seen[r][c] = true;
                q.offer(new int[]{r, c});
            }
        }
        steps++;
    }
    return -1;
}
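Multi-source spreading (the rotting-oranges bullet) is the same level-by-level loop, just seeded with every source before the first round. A sketch, assuming 2 = source, 1 = fresh, 0 = empty:

```java
import java.util.*;

// Rounds for value-2 cells to reach every value-1 cell; -1 if some stay unreached
public static int spreadRounds(int[][] grid) {
    int R = grid.length, C = grid[0].length, fresh = 0;
    Queue<int[]> q = new ArrayDeque<>();
    for (int r = 0; r < R; r++)
        for (int c = 0; c < C; c++) {
            if (grid[r][c] == 2) q.offer(new int[]{r, c});  // ALL sources start queued
            if (grid[r][c] == 1) fresh++;
        }
    int[][] dirs = {{-1,0}, {1,0}, {0,-1}, {0,1}};
    int rounds = 0;
    while (!q.isEmpty() && fresh > 0) {
        for (int sz = q.size(); sz > 0; sz--) {     // one full level = one round
            int[] cur = q.poll();
            for (int[] d : dirs) {
                int r = cur[0] + d[0], c = cur[1] + d[1];
                if (r < 0 || r >= R || c < 0 || c >= C || grid[r][c] != 1) continue;
                grid[r][c] = 2;                     // mark reached — doubles as "seen"
                fresh--;
                q.offer(new int[]{r, c});
            }
        }
        rounds++;
    }
    return fresh == 0 ? rounds : -1;
}
```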
§ 9 · DFS

Depth-First Search

DFS dives as deep as possible before backing up. If BFS is ripples spreading evenly, DFS is a single explorer plunging into one branch until it hits a wall, then backing up to try the next branch. It's the workhorse of tree problems, connectivity questions, and path-finding when you just need any path, not the shortest.

Picture exploring a museum with a rule: at every fork, always take the leftmost unvisited door. You'll go deep into one wing, hit dead ends, back out, take the next door, repeat. Eventually you've seen every room. That single-minded plunge-and-backtrack is DFS.

Recursive template

void dfs(int node, List<List<Integer>> graph, boolean[] seen) {
    if (seen[node]) return;
    seen[node] = true;
    // process node
    for (int nb : graph.get(node)) dfs(nb, graph, seen);
}

Iterative template (with a stack)

void dfsIter(int start, List<List<Integer>> graph) {
    Deque<Integer> stack = new ArrayDeque<>();
    boolean[] seen = new boolean[graph.size()];
    stack.push(start);
    while (!stack.isEmpty()) {
        int node = stack.pop();
        if (seen[node]) continue;
        seen[node] = true;
        for (int nb : graph.get(node))
            if (!seen[nb]) stack.push(nb);
    }
}
BFS or DFS?
Need the shortest path / fewest moves → BFS. Need to enumerate paths, detect cycles, count connected components, or do anything with backtracking → DFS. When both work, DFS is usually shorter to write because recursion handles the stack for you.
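Counting connected components — one of the DFS jobs named above — is a sketch away: launch the recursive walk from every unseen node and count the launches.

```java
import java.util.*;

// Number of connected components in an undirected graph (adjacency lists)
public static int countComponents(int n, List<List<Integer>> graph) {
    boolean[] seen = new boolean[n];
    int components = 0;
    for (int i = 0; i < n; i++) {
        if (!seen[i]) {
            components++;            // every fresh start is a new component
            explore(i, graph, seen);
        }
    }
    return components;
}

private static void explore(int node, List<List<Integer>> graph, boolean[] seen) {
    seen[node] = true;
    for (int nb : graph.get(node))
        if (!seen[nb]) explore(nb, graph, seen);
}
```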
§ 10 · Topological Sort

Topological Sort

Imagine you're scheduling university courses where some have prerequisites. CS201 requires CS101. CS301 requires CS201. A topological sort gives you a valid order to take everything—an order that never breaks a dependency. It only works on directed acyclic graphs (DAGs); if there's a cycle, no valid order exists.

Topo sort is the order you'd put on clothes. Socks before shoes. Shirt before jacket. Many valid orders exist—socks-shirt-shoes-jacket or shirt-socks-jacket-shoes—but you can never wear shoes before socks. Toposort produces any order that respects every "before" constraint.

Kahn's algorithm — BFS with in-degree counts

public int[] topoSort(int n, int[][] prereqs) {
    List<List<Integer>> graph = new ArrayList<>();
    for (int i = 0; i < n; i++) graph.add(new ArrayList<>());
    int[] inDeg = new int[n];

    for (int[] p : prereqs) {
        graph.get(p[1]).add(p[0]);   // p[1] → p[0]
        inDeg[p[0]]++;
    }

    Queue<Integer> q = new ArrayDeque<>();
    for (int i = 0; i < n; i++)
        if (inDeg[i] == 0) q.offer(i);

    int[] order = new int[n];
    int idx = 0;
    while (!q.isEmpty()) {
        int node = q.poll();
        order[idx++] = node;
        for (int nb : graph.get(node))
            if (--inDeg[nb] == 0) q.offer(nb);
    }
    return idx == n ? order : new int[0];  // empty = cycle
}
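The DFS alternative to Kahn's is worth having in the drawer: post-order finish times, reversed, give a valid order. A sketch that assumes the graph is already a DAG (no cycle detection):

```java
import java.util.*;

// Topological order via DFS post-order — assumes the graph is a DAG
public static int[] topoSortDFS(int n, List<List<Integer>> graph) {
    boolean[] seen = new boolean[n];
    Deque<Integer> out = new ArrayDeque<>();   // push on finish = reversed post-order
    for (int i = 0; i < n; i++)
        if (!seen[i]) finish(i, graph, seen, out);
    int[] order = new int[n];
    int idx = 0;
    for (int node : out) order[idx++] = node;  // Deque iterates head-first: last finished first
    return order;
}

private static void finish(int node, List<List<Integer>> graph,
                           boolean[] seen, Deque<Integer> out) {
    seen[node] = true;
    for (int nb : graph.get(node))
        if (!seen[nb]) finish(nb, graph, seen, out);
    out.push(node);   // all descendants are done, so node may precede them
}
```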
§ 11 · Dijkstra's Algorithm

Dijkstra — Weighted Shortest Path

BFS finds the shortest path when every edge costs 1. The moment edges have different weights—roads with tolls, flights with fares, networks with latencies—BFS breaks. Dijkstra's algorithm fixes this by always expanding the cheapest-so-far node next, using a min-heap. It's BFS with a priority queue. Caveat: all edge weights must be non-negative.

Water flooding a landscape always fills the lowest unfilled point first—not by order of discovery, but by elevation. Dijkstra is exactly that: at each step, expand the node with the smallest total distance found so far. The first time we settle a node, that distance is optimal.

Java template

public int[] dijkstra(int n, List<int[]>[] graph, int src) {
    // graph[u] contains {v, weight} pairs
    int[] dist = new int[n];
    Arrays.fill(dist, Integer.MAX_VALUE);
    dist[src] = 0;

    PriorityQueue<int[]> pq = new PriorityQueue<>((a, b) -> a[1] - b[1]);
    pq.offer(new int[]{src, 0});

    while (!pq.isEmpty()) {
        int[] cur = pq.poll();
        int u = cur[0], d = cur[1];
        if (d > dist[u]) continue;    // stale entry
        for (int[] edge : graph[u]) {
            int v = edge[0], w = edge[1];
            if (d + w < dist[v]) {
                dist[v] = d + w;
                pq.offer(new int[]{v, dist[v]});
            }
        }
    }
    return dist;
}

Complexity: O((V + E) log V) with a binary heap—fast enough for almost everything.
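One wrinkle the template glosses over: Java generics won't let you create a List<int[]>[] array cleanly. A sketch of building the adjacency structure from a {u, v, weight} edge list (the unchecked cast is the standard workaround):

```java
import java.util.*;

// Build the List<int[]>[] adjacency array the template expects
@SuppressWarnings("unchecked")
public static List<int[]>[] buildGraph(int n, int[][] edges, boolean directed) {
    List<int[]>[] graph = new List[n];            // raw array, then fill each slot
    for (int i = 0; i < n; i++) graph[i] = new ArrayList<>();
    for (int[] e : edges) {
        graph[e[0]].add(new int[]{e[1], e[2]});   // u → {v, weight}
        if (!directed) graph[e[1]].add(new int[]{e[0], e[2]});
    }
    return graph;
}
```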

§ 12 · Greedy

Greedy

A greedy algorithm makes the locally best choice at every step and hopes it adds up to the globally best answer. Often it does. Often it doesn't. The art is knowing when. Greedy is fast when it works—usually O(n log n) including the sort—but proving it's correct on a given problem is the actual challenge.

Imagine paying ₹87 with the fewest coins. With Indian coins {₹1, ₹2, ₹5, ₹10}, greedy works: take as many ₹10s as fit, then ₹5s, then ₹2s, then ₹1s. But on a weird currency like {₹1, ₹3, ₹4}, greedy fails — for ₹6 it grabs ₹4 + ₹1 + ₹1 (three coins), missing the better answer ₹3 + ₹3 (two coins). Greedy is only as good as the problem allows.

Telltale signs greedy might work

  • Sorting unlocks structure — sort by deadline, by start time, by ratio.
  • "Pick maximum number of X" or "minimum cost to do Y."
  • Interval problems — selecting / merging / scheduling.
  • You can argue an exchange: swapping a non-greedy choice for the greedy one never makes things worse.

Classic example — interval scheduling

// Maximum non-overlapping meetings — sort by END time, pick earliest-ending
public int maxMeetings(int[][] meetings) {
    Arrays.sort(meetings, (a, b) -> a[1] - b[1]);
    int count = 0, lastEnd = Integer.MIN_VALUE;
    for (int[] m : meetings) {
        if (m[0] >= lastEnd) {
            count++;
            lastEnd = m[1];
        }
    }
    return count;
}
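Merging, the other interval staple from the checklist, flips the sort key: sort by start, then either extend the current interval or start a new one — a sketch:

```java
import java.util.*;

// Merge overlapping intervals — sort by START, extend or emit
public static int[][] mergeIntervals(int[][] intervals) {
    Arrays.sort(intervals, (a, b) -> Integer.compare(a[0], b[0]));
    List<int[]> out = new ArrayList<>();
    for (int[] iv : intervals) {
        if (!out.isEmpty() && iv[0] <= out.get(out.size() - 1)[1]) {
            int[] last = out.get(out.size() - 1);
            last[1] = Math.max(last[1], iv[1]);   // overlap → extend the current interval
        } else {
            out.add(new int[]{iv[0], iv[1]});     // gap → start a new interval
        }
    }
    return out.toArray(new int[0][]);
}
```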
§ 13 · Dynamic Programming

Dynamic Programming

Dynamic programming is recursion that remembers. The same subproblem comes up again and again; instead of resolving it, you cache the answer the first time and reuse it. That's the whole idea. The hard part isn't coding—it's defining the right "state" so that the answer for state S can be built from the answers of smaller states.

Imagine climbing stairs where each step you can take 1 or 2 stairs. How many ways to reach step N? You realise the number of ways to reach step N is just (ways to reach N-1) + (ways to reach N-2)—because you got there either by a 1-step or a 2-step. Once you compute "ways(5)" once, you'd be foolish to compute it again. Write it down. Reuse it. That's DP.

The DP method — three steps

  1. Define the state. What does dp[i] (or dp[i][j]) mean? Write it out as an English sentence: "the maximum sum of a subarray ending at index i."
  2. Write the recurrence. How does the answer for state S build from smaller states?
  3. Identify base cases. The smallest states where the answer is direct.

Top-down (memoization)

int[] memo;

public int climbStairs(int n) {
    memo = new int[n + 1];
    return solve(n);
}
private int solve(int n) {
    if (n <= 2) return n;
    if (memo[n] != 0) return memo[n];
    return memo[n] = solve(n - 1) + solve(n - 2);
}

Bottom-up (tabulation)

public int climbStairs(int n) {
    if (n <= 2) return n;
    int[] dp = new int[n + 1];
    dp[1] = 1; dp[2] = 2;
    for (int i = 3; i <= n; i++)
        dp[i] = dp[i - 1] + dp[i - 2];
    return dp[n];
}

The DP family — patterns you'll see again and again

  • 1D DP — Climbing Stairs, House Robber, Max Subarray (Kadane).
  • 0/1 Knapsack — Pick or skip each item. Subset Sum, Partition Equal Subset.
  • Unbounded Knapsack — Use each item infinite times. Coin Change.
  • Longest Increasing Subsequence (LIS) — O(n²) DP or O(n log n) patience-sort.
  • Longest Common Subsequence (LCS) — 2D DP on two strings.
  • Edit Distance — 2D DP, the canonical string-comparison problem.
  • Matrix DP — Unique Paths, Min Path Sum on a grid.
  • Interval DP — Burst Balloons, Matrix Chain Multiplication.

Example — 0/1 Knapsack template

public int knapsack(int[] wt, int[] val, int cap) {
    int n = wt.length;
    int[][] dp = new int[n + 1][cap + 1];
    for (int i = 1; i <= n; i++) {
        for (int w = 0; w <= cap; w++) {
            dp[i][w] = dp[i - 1][w];                       // skip item
            if (w >= wt[i - 1])
                dp[i][w] = Math.max(dp[i][w],
                          dp[i - 1][w - wt[i - 1]] + val[i - 1]);
        }
    }
    return dp[n][cap];
}
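And the unbounded cousin from the family list — Coin Change, where each coin may be reused. It also settles the {₹1, ₹3, ₹4} currency that defeated greedy earlier:

```java
import java.util.*;

// Minimum coins to make amount; -1 if impossible (unbounded knapsack)
public static int coinChange(int[] coins, int amount) {
    int[] dp = new int[amount + 1];
    Arrays.fill(dp, amount + 1);              // amount+1 acts as "infinity"
    dp[0] = 0;                                // zero coins make zero
    for (int a = 1; a <= amount; a++)
        for (int c : coins)
            if (c <= a) dp[a] = Math.min(dp[a], dp[a - c] + 1);
    return dp[amount] > amount ? -1 : dp[amount];
}
```

coinChange(new int[]{1, 3, 4}, 6) returns 2 — the ₹3 + ₹3 answer greedy missed.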
How to start a DP problem
Always begin with the recursive solution. Forget optimization. Write the brute recursion that just tries every option. Once it works on small inputs, look at the parameters of the recursive function — those are your DP state. Add memoization with a HashMap or 2D array, and you're done. Bottom-up is just a polish step.
§ 14 · Bit Manipulation

Bit Manipulation

Every integer is just a string of 0s and 1s. Bit manipulation works at that level directly—setting, clearing, flipping, or counting individual bits. The payoff: operations that would take a loop in normal code become single instructions. Used cleverly, they replace HashSets with a single integer.

Imagine 32 light switches in a row. You can turn any one on or off, ask "is switch 7 on?", flip them all at once. That's bit manipulation: each switch is a bit, and clever combinations of AND, OR, XOR, and shifts let you query and modify the whole row in one operation.

The operations to memorise

// AND  &     OR  |     XOR  ^     NOT  ~     Left  <<     Right  >>

int n = 12;            // binary: 1100
int k = 2;             // an example bit position

int isOdd  = n & 1;             // 0 — is the last bit set?
int set3   = n | (1 << 3);      // set bit 3
int clear2 = n & ~(1 << 2);     // clear bit 2
int flip2  = n ^ (1 << 2);      // flip bit 2
int bitK   = (n >> k) & 1;      // get value of bit k (0 or 1)

// XOR magic: x ^ x == 0  and  x ^ 0 == x
// → find the unique number in {a, a, b, b, c, c, d} : XOR them all

// Useful built-ins
Integer.bitCount(n);            // number of 1-bits
Integer.toBinaryString(n);
Integer.numberOfTrailingZeros(n);

// n & (n - 1) clears the LOWEST set bit — count bits in O(set-bits)
int count = 0;
while (n != 0) {
    n &= (n - 1);
    count++;
}
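The XOR pair-cancellation trick above, written out — a sketch:

```java
// Every value appears twice except one — pairs cancel under XOR
public static int singleNumber(int[] nums) {
    int x = 0;
    for (int n : nums) x ^= n;   // a ^ a = 0, so duplicates vanish
    return x;                    // only the unique value survives
}
```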
§ 15 · Monotonic Stack

Monotonic Stack

A monotonic stack is a stack where you maintain an invariant: elements inside it stay in increasing (or decreasing) order. Whenever pushing a new element would break the order, you pop the offending elements first. Sounds niche, but it's the secret to "next greater element," "largest rectangle in histogram," and "daily temperatures" problems—all in O(n).

Think of stacking books from tallest to shortest. When a new book is taller than the one on top, the top book has to come off — and since the original stack was ordered, every book about to be popped knows its "next taller neighbour" is the new book. That's the magic: as elements get popped, you learn their answer.

Template — next greater element

public int[] nextGreater(int[] nums) {
    int n = nums.length;
    int[] res = new int[n];
    Arrays.fill(res, -1);
    Deque<Integer> stack = new ArrayDeque<>();   // indices
    for (int i = 0; i < n; i++) {
        while (!stack.isEmpty() && nums[stack.peek()] < nums[i]) {
            res[stack.pop()] = nums[i];
        }
        stack.push(i);
    }
    return res;
}
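Daily Temperatures, named above, is the same pop-and-resolve loop with distances recorded instead of values — a sketch:

```java
import java.util.*;

// For each day, how many days until a warmer temperature; 0 if never
public static int[] dailyTemperatures(int[] temps) {
    int n = temps.length;
    int[] res = new int[n];
    Deque<Integer> stack = new ArrayDeque<>();   // indices of still-unresolved days
    for (int i = 0; i < n; i++) {
        while (!stack.isEmpty() && temps[stack.peek()] < temps[i]) {
            int j = stack.pop();
            res[j] = i - j;                      // record the DISTANCE, not the value
        }
        stack.push(i);
    }
    return res;                                  // unpopped days stay 0
}
```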
§ 16 · The Recognition Playbook

The recognition playbook

When a new problem arrives, walk this checklist. The first match is usually the right pattern.

If the problem mentions…                                → Reach for…
Sorted array, target sum, pair / triplet                → Two Pointers
Contiguous subarray / substring, "longest / smallest"   → Sliding Window
Many range-sum queries on a static array                → Prefix Sum
Sorted input, or "find / search / minimum X such that…" → Binary Search (sometimes "on the answer")
"All permutations / subsets / combinations / ways"      → Backtracking
"Top K" / "Kth smallest / largest"                      → Heap (PriorityQueue)
Tree problem, anything                                  → Recursion (DFS), sometimes BFS for levels
Shortest path in unweighted graph / grid                → BFS
Shortest path with non-negative weights                 → Dijkstra
Connected components / "are X and Y connected?"         → Union-Find or DFS
Dependencies / ordering / prerequisites                 → Topological Sort
"Count / minimum / maximum number of ways"              → Dynamic Programming
Two strings → compare / transform / match               → 2D DP (LCS, Edit Distance pattern)
Pick items with constraints                             → Knapsack DP
"Next greater / smaller element"                        → Monotonic Stack
Prefix matching / autocomplete                          → Trie
"Find unique number" with bit-pair-up tricks            → XOR / Bit Manipulation
Locally-best choice each step works out                 → Greedy (always verify with a test case)

Patterns in hand. Now sharpen the language you'll write them in.

Open the Java Toolkit