Divide and Conquer: Mergesort and Quicksort - StudyPulse

Divide and Conquer: Mergesort and Quicksort

Algorithmics (HESS)
01 May 2026


The Divide and Conquer Pattern

Divide and conquer solves a problem by:
1. Divide: Split the problem into smaller sub-problems
2. Conquer: Solve each sub-problem recursively (or directly if small enough)
3. Combine: Merge sub-solutions into the overall solution

For a divide-and-conquer algorithm that splits the input into $a$ sub-problems of size $n/b$ and spends linear time on the divide and combine steps, the running time satisfies the recurrence:

$$T(n) = aT(n/b) + O(n)$$
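The recurrence can be checked numerically. The sketch below unrolls mergesort's case ($a = b = 2$, exactly $n$ work per level, $T(1) = 0$) and compares it against $n \log_2 n$; for powers of two the two agree exactly:

```python
import math

def T(n):
    """Unroll T(n) = 2*T(n/2) + n with T(1) = 0 (mergesort's recurrence)."""
    if n <= 1:
        return 0
    return 2 * T(n // 2) + n

for n in [16, 256, 4096]:
    print(n, T(n), n * math.log2(n))   # T(n) equals n*log2(n) for powers of 2
```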

KEY TAKEAWAY: Mergesort always runs in $O(n \log n)$; Quicksort averages $O(n \log n)$ but has a $O(n^2)$ worst case. Both use the divide-and-conquer pattern.


Mergesort

Strategy

Split the array in half, recursively sort each half, merge the sorted halves.

Algorithm

ALGORITHM MergeSort(A)
    if length(A) <= 1 then return A   // base case
    mid ← ⌊length(A) / 2⌋
    left ← MergeSort(A[0..mid-1])
    right ← MergeSort(A[mid..length(A)-1])
    return Merge(left, right)

ALGORITHM Merge(left, right)
    result ← []
    i ← 0; j ← 0
    while i < length(left) AND j < length(right) do
        if left[i] <= right[j] then
            append left[i] to result; i ← i + 1
        else
            append right[j] to result; j ← j + 1
        end if
    end while
    append remaining elements of left or right
    return result
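The MergeSort and Merge pseudocode above translates directly into Python; this is a minimal runnable sketch (not the only way to implement it):

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:          # <= keeps equal elements in order (stable)
            result.append(left[i]); i += 1
        else:
            result.append(right[j]); j += 1
    result.extend(left[i:])              # append whatever remains of left
    result.extend(right[j:])             # ...and of right (at most one is non-empty)
    return result

def merge_sort(a):
    if len(a) <= 1:                      # base case: 0 or 1 elements
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

print(merge_sort([5, 2, 9, 1, 5, 6]))    # → [1, 2, 5, 5, 6, 9]
```

Note the `<=` comparison in `merge`: taking from `left` on ties is what makes mergesort stable.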

Complexity

  • Recurrence: $T(n) = 2T(n/2) + O(n)$
  • Master Theorem: $a=2, b=2, c=1$; $\log_b a = 1 = c$ → Case 2
  • Result: $T(n) = O(n \log n)$ in all cases (best, average, worst)

Space: $O(n)$ auxiliary (for merge step)


Quicksort

Strategy

Choose a pivot element, partition the array into elements ≤ pivot and elements > pivot, recursively sort each partition.

Algorithm

ALGORITHM QuickSort(A, low, high)
    if low < high then
        pivot_index ← Partition(A, low, high)
        QuickSort(A, low, pivot_index - 1)
        QuickSort(A, pivot_index + 1, high)
    end if

ALGORITHM Partition(A, low, high)
    pivot ← A[high]   // choose last element as pivot
    i ← low - 1
    for j ← low to high - 1 do
        if A[j] <= pivot then
            i ← i + 1
            swap A[i] and A[j]
        end if
    end for
    swap A[i+1] and A[high]
    return i + 1
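The same algorithm in runnable Python, as a sketch of the in-place version with the last-element (Lomuto-style) partition shown above:

```python
def partition(a, low, high):
    """Rearrange a[low..high] around a[high] and return the pivot's final index."""
    pivot = a[high]                      # last element as pivot
    i = low - 1                          # boundary of the "<= pivot" region
    for j in range(low, high):
        if a[j] <= pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[high] = a[high], a[i + 1]  # place pivot between the two regions
    return i + 1

def quick_sort(a, low=0, high=None):
    if high is None:
        high = len(a) - 1
    if low < high:
        p = partition(a, low, high)
        quick_sort(a, low, p - 1)        # sort elements <= pivot
        quick_sort(a, p + 1, high)       # sort elements > pivot
    return a

print(quick_sort([5, 2, 9, 1, 5, 6]))    # → [1, 2, 5, 5, 6, 9]
```

Unlike the mergesort sketch, this sorts the list in place; no auxiliary arrays are allocated.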

Complexity

| Case | Condition | Recurrence | Result |
|---|---|---|---|
| Average | Random data | $T(n) = 2T(n/2) + O(n)$ | $O(n \log n)$ |
| Worst | Sorted or reverse-sorted input; pivot always min/max | $T(n) = T(n-1) + O(n)$ | $O(n^2)$ |
| Best | Pivot always median | $T(n) = 2T(n/2) + O(n)$ | $O(n \log n)$ |

Space: $O(\log n)$ average (call stack depth); $O(n)$ worst case


Comparison

| Property | Mergesort | Quicksort |
|---|---|---|
| Time (worst) | $O(n \log n)$ | $O(n^2)$ |
| Time (average) | $O(n \log n)$ | $O(n \log n)$ |
| Space | $O(n)$ | $O(\log n)$ average |
| Stable? | Yes | No (standard) |
| In-place? | No | Yes |
| Cache performance | Worse | Better (locality) |

EXAM TIP: Know how to trace Merge (show the merge step comparing elements from two sorted halves) and Partition (show how elements are rearranged around the pivot). Both are standard exam questions.

COMMON MISTAKE: Quicksort’s worst case ($O(n^2)$) occurs when the pivot is always the smallest or largest element (e.g., on already-sorted input with last-element pivot). Random pivot selection avoids this in practice.
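Random pivot selection is a one-line change to the Python quicksort above: swap a randomly chosen element into the pivot slot before partitioning, so sorted input no longer triggers the worst case deterministically. A sketch:

```python
import random

def partition(a, low, high):
    """Lomuto partition around a[high]; returns the pivot's final index."""
    pivot = a[high]
    i = low - 1
    for j in range(low, high):
        if a[j] <= pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[high] = a[high], a[i + 1]
    return i + 1

def randomized_quick_sort(a, low=0, high=None):
    if high is None:
        high = len(a) - 1
    if low < high:
        r = random.randint(low, high)    # pick a random index in [low, high]
        a[r], a[high] = a[high], a[r]    # move it into the pivot position
        p = partition(a, low, high)
        randomized_quick_sort(a, low, p - 1)
        randomized_quick_sort(a, p + 1, high)
    return a

print(randomized_quick_sort(list(range(20, 0, -1))))  # sorted despite reverse-sorted input
```

With a random pivot, the $O(n^2)$ behaviour is still possible but occurs only with vanishingly small probability, regardless of input order.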
