Sorting Algorithms: From Bubble Sort to Quick Select
Understand every major sorting algorithm — their mechanics, time complexities, stability, and when to use each. Includes merge sort, quick sort, counting sort, and quick select.
Sorting at a Glance
Comparison-based sorts cannot beat O(n log n) — the information-theoretic lower bound holds even on average. Merge sort guarantees O(n log n) in all cases but uses O(n) auxiliary space. Quick sort is O(n log n) on average with only O(log n) expected stack space, but degrades to O(n²) in the worst case. For integers in a known range, counting sort and radix sort achieve O(n). Knowing which to use and why is what separates strong engineers.
Merge Sort: Divide & Conquer
Merge sort splits the array in half, recursively sorts both halves, and merges them. It's stable, predictable O(n log n), and great for linked lists and external sorting. The merge step is the key — two sorted arrays can be merged in O(n).
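The claim that two sorted arrays merge in O(n) can be sanity-checked with the standard library's heapq.merge, which performs exactly this lazy linear merge; the hand-rolled version follows.

```python
import heapq

# heapq.merge lazily consumes already-sorted iterables, doing O(n) total
# work across all inputs -- the same linear merge step implemented below.
merged = list(heapq.merge([1, 4, 7], [2, 3, 9]))
print(merged)  # [1, 2, 3, 4, 7, 9]
```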
def merge_sort(arr: list[int]) -> list[int]:
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left: list[int], right: list[int]) -> list[int]:
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

Quick Sort: Partition & Conquer
Quick sort picks a pivot, partitions elements into less-than and greater-than groups, and recurses on each partition. It's in-place (O(log n) stack space) and cache-friendly. The Lomuto partition scheme is simpler; Hoare's is faster in practice.
import random

def quick_sort(arr: list[int], lo: int = 0, hi: int | None = None) -> None:
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:
        return
    pivot_idx = partition(arr, lo, hi)
    quick_sort(arr, lo, pivot_idx - 1)
    quick_sort(arr, pivot_idx + 1, hi)

def partition(arr: list[int], lo: int, hi: int) -> int:
    """Lomuto partition with random pivot."""
    pivot_idx = random.randint(lo, hi)
    arr[pivot_idx], arr[hi] = arr[hi], arr[pivot_idx]  # move pivot to the end
    pivot = arr[hi]
    i = lo
    for j in range(lo, hi):
        if arr[j] < pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]  # place pivot in its final position
    return i

Random pivot selection avoids the O(n²) worst case on already-sorted input. In practice, median-of-three pivot selection does even better. Python's built-in sort uses Timsort, a hybrid of merge sort and insertion sort optimized for real-world data.
Quick Select: Kth Element in O(n)
Quick select uses the same partition as quick sort but only recurses on the side containing the target index. Average O(n) time, O(1) space. It's how you find the kth largest/smallest without fully sorting.
def quick_select(arr: list[int], k: int) -> int:
    """Find the kth smallest element (0-indexed). Average O(n).

    Partially reorders arr in place; uses partition() from above.
    """
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        pivot_idx = partition(arr, lo, hi)
        if pivot_idx == k:
            return arr[k]
        elif pivot_idx < k:
            lo = pivot_idx + 1
        else:
            hi = pivot_idx - 1
    return arr[lo]  # unreachable for valid 0 <= k < len(arr)

Counting Sort: Linear Time
When values are integers in a bounded range [0, k], counting sort achieves O(n + k) time: count occurrences of each value, compute prefix sums to find each value's starting position, then place elements. The prefix-sum version is stable and forms the basis for radix sort; for bare integers, the simpler tally-and-rebuild version below suffices.
def counting_sort(arr: list[int]) -> list[int]:
    """Tally-and-rebuild counting sort. Assumes non-negative integers."""
    if not arr:
        return arr
    max_val = max(arr)
    count = [0] * (max_val + 1)
    for num in arr:
        count[num] += 1
    result = []
    for val, freq in enumerate(count):
        result.extend([val] * freq)
    return result

Decision guide: Use merge sort when stability matters or for linked lists. Use quick sort for in-place sorting of arrays. Use counting/radix sort for integers in a known range. In practice, just use your language's built-in sort (Timsort); it's hard to beat.
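The prefix-sum placement described in the counting sort section matters once you sort records by an integer key, since the tally-and-rebuild version discards the associated data. A hedged sketch, assuming (key, payload) tuples (the function name and input shape are illustrative):

```python
def counting_sort_by_key(items: list[tuple[int, str]]) -> list[tuple[int, str]]:
    """Stable counting sort of (key, payload) pairs by non-negative key.

    Prefix sums turn per-key counts into starting positions, so equal
    keys keep their original relative order -- the stability property
    that radix sort relies on.
    """
    if not items:
        return items
    max_key = max(key for key, _ in items)
    count = [0] * (max_key + 2)
    for key, _ in items:
        count[key + 1] += 1            # shifted by one so prefix sums
    for k in range(1, len(count)):     # give start positions directly
        count[k] += count[k - 1]
    result = [None] * len(items)
    for item in items:                 # forward pass preserves stability
        result[count[item[0]]] = item
        count[item[0]] += 1
    return result

print(counting_sort_by_key([(2, 'a'), (1, 'b'), (2, 'c'), (0, 'd')]))
# [(0, 'd'), (1, 'b'), (2, 'a'), (2, 'c')]
```

Note that 'a' still precedes 'c' among the key-2 entries; running this once per digit, least significant first, is exactly LSD radix sort.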