Merge Sort

March 31, 2026
#merge-sort #sorting #algorithms #divide-and-conquer #data-structures #interviewing

Introduction

Merge Sort is a divide-and-conquer sorting algorithm that splits an array in half, recursively sorts each half, then merges the two sorted halves back together. It’s like sorting a deck of cards by splitting it into smaller and smaller piles until each pile has one card, then merging piles together in order.

Unlike Quick Sort, which can degrade to O(n²) with bad pivot choices, Merge Sort guarantees O(n log n) in every case. The tradeoff is that it needs O(n) extra space. You’ll want to be comfortable with Recursion, Arrays, and Big-O Notation before diving in.

How Merge Sort works

  1. Divide: Split the array into two halves
  2. Conquer: Recursively sort each half
  3. Combine: Merge the two sorted halves into one sorted array

Visual example

[38, 27, 43, 3, 9, 82, 10]

Split:
[38, 27, 43]          [3, 9, 82, 10]
[38]  [27, 43]        [3, 9]  [82, 10]
[38]  [27] [43]       [3] [9] [82] [10]

Merge back:
[38]  [27, 43]        [3, 9]  [10, 82]
[27, 38, 43]          [3, 9, 10, 82]

[3, 9, 10, 27, 38, 43, 82]

The key insight: merging two sorted arrays into one sorted array is easy and takes O(n) time. You just compare the front elements and pick the smaller one.

Implementation
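A straightforward top-down version in JavaScript might look like this (a sketch; `mergeSort` and `merge` are illustrative names, not from any library):

```javascript
// Recursively split, sort each half, and merge. Returns a new sorted array.
function mergeSort(arr) {
  if (arr.length <= 1) return arr; // base case: 0 or 1 elements is already sorted
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  return merge(left, right);
}

// Merge two sorted arrays by repeatedly taking the smaller front element.
function merge(left, right) {
  const result = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    if (left[i] <= right[j]) {   // <= keeps the sort stable
      result.push(left[i++]);
    } else {
      result.push(right[j++]);
    }
  }
  // One side is exhausted; append the remainder of the other.
  return result.concat(left.slice(i), right.slice(j));
}

console.log(mergeSort([38, 27, 43, 3, 9, 82, 10]));
// → [3, 9, 10, 27, 38, 43, 82]
```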

In-place Merge Sort

The version above creates new arrays at every level. An in-place variant uses indices and a shared temporary buffer:
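One way to sketch that variant (index-based recursion with a scratch buffer allocated once at the top call; the names are illustrative):

```javascript
// Sorts arr in place between indices lo..hi (inclusive), using temp as scratch.
// The temp buffer is allocated once via the default parameter, then shared
// by every recursive call.
function mergeSortInPlace(arr, lo = 0, hi = arr.length - 1, temp = new Array(arr.length)) {
  if (lo >= hi) return arr; // 0 or 1 elements: nothing to sort
  const mid = Math.floor((lo + hi) / 2);
  mergeSortInPlace(arr, lo, mid, temp);
  mergeSortInPlace(arr, mid + 1, hi, temp);

  // Merge arr[lo..mid] and arr[mid+1..hi] into temp, then copy back.
  let i = lo, j = mid + 1, k = lo;
  while (i <= mid && j <= hi) {
    temp[k++] = arr[i] <= arr[j] ? arr[i++] : arr[j++]; // <= keeps stability
  }
  while (i <= mid) temp[k++] = arr[i++];
  while (j <= hi) temp[k++] = arr[j++];
  for (let m = lo; m <= hi; m++) arr[m] = temp[m];
  return arr;
}
```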

This still uses O(n) extra space for the temp buffer, but it allocates it once instead of at every recursive call.

Why the merge step works

The merge step is the heart of the algorithm. It works because both halves are already sorted:

Left:  [3, 27, 38]    Right: [9, 10, 43, 82]
        ^                      ^
Compare 3 vs 9 → take 3

Left:  [3, 27, 38]    Right: [9, 10, 43, 82]
           ^                   ^
Compare 27 vs 9 → take 9

Left:  [3, 27, 38]    Right: [9, 10, 43, 82]
           ^                      ^
Compare 27 vs 10 → take 10

Left:  [3, 27, 38]    Right: [9, 10, 43, 82]
           ^                          ^
Compare 27 vs 43 → take 27

...and so on → [3, 9, 10, 27, 38, 43, 82]

Each element is compared and placed exactly once, so the merge step is O(n).

Complexity analysis

Case      Time          Space
Best      O(n log n)    O(n)
Average   O(n log n)    O(n)
Worst     O(n log n)    O(n)

Why always O(n log n)? The array is always split in half, regardless of its contents (unlike Quick Sort, where the split depends on the pivot). That gives roughly log₂ n levels of recursion, and each level does O(n) total work merging, so the running time is n × log n in every case.
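Written as a recurrence, the running time satisfies:

```latex
T(n) = 2\,T(n/2) + O(n) \quad\Longrightarrow\quad T(n) = O(n \log n)
```

The 2T(n/2) term is the two recursive halves; the O(n) term is the merge at that level.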

Space: O(n) for the temporary arrays used during merging, plus O(log n) for the recursion stack.

Stability

Merge Sort is stable — equal elements maintain their original relative order. This matters when sorting objects by multiple criteria:

// Students start in name order; a stable sort by grade
// preserves that name order within each grade
const students = [
  { name: 'Alice', grade: 'B' },
  { name: 'Bob', grade: 'A' },
  { name: 'Charlie', grade: 'B' },
];

// Array.prototype.sort is guaranteed stable in modern JavaScript
students.sort((a, b) => a.grade.localeCompare(b.grade));

// Result: Bob (A), Alice (B), Charlie (B)
// Alice still comes before Charlie — original order preserved

The stability comes from the <= in the merge step. When elements are equal, we take from the left half first, preserving original order.

When to use Merge Sort

  • You need guaranteed O(n log n) — no worst-case surprises
  • You need a stable sort
  • You’re sorting linked lists — Merge Sort is ideal because it doesn’t need random access
  • You’re sorting external data (files too large for memory) — merge naturally works on chunks
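The linked-list case is worth sketching: splitting uses slow/fast pointers and merging works by relinking nodes, so no extra arrays are needed (a sketch; `node`, `sortList`, and `mergeLists` are illustrative names):

```javascript
// Minimal singly linked list node (illustrative helper)
function node(value, next = null) {
  return { value, next };
}

// Merge Sort on a linked list: split with slow/fast pointers, merge by relinking.
function sortList(head) {
  if (!head || !head.next) return head; // 0 or 1 nodes: already sorted

  // Find the middle and cut the list in two.
  let slow = head, fast = head.next;
  while (fast && fast.next) {
    slow = slow.next;
    fast = fast.next.next;
  }
  const right = slow.next;
  slow.next = null;

  return mergeLists(sortList(head), sortList(right));
}

// Merge two sorted lists by relinking nodes (no extra arrays needed).
function mergeLists(a, b) {
  const dummy = node(null);
  let tail = dummy;
  while (a && b) {
    if (a.value <= b.value) { tail.next = a; a = a.next; } // <= keeps stability
    else { tail.next = b; b = b.next; }
    tail = tail.next;
  }
  tail.next = a || b; // attach whichever list remains
  return dummy.next;
}
```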

When to use Quick Sort instead:

  • Sorting arrays in memory where cache performance matters
  • Space is constrained (Quick Sort is in-place)
  • Average-case speed matters more than worst-case guarantees

Practice problems

  1. Sort an Array (implement Merge Sort from scratch)
  2. Count of Smaller Numbers After Self (modified merge sort)
  3. Reverse Pairs (count inversions using merge sort)
  4. Sort a Linked List
  5. Merge k Sorted Lists

The merge operation itself is a fundamental building block. Once you’re comfortable with it, problems involving merging sorted sequences, counting inversions, and external sorting become much more approachable.
