
Lower Bounds

A lower bound proves that no algorithm in a given model of computation can solve a problem with fewer than a certain number of basic operations. Lower bounds tell us when to stop searching for faster algorithms.

Decision Tree Model

Comparison-Based Lower Bound

Model any comparison-based algorithm as a binary decision tree. Each internal node is a comparison (a ≤ b?). Each leaf is an output.
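As a sketch of the model, the instrumented insertion sort below counts comparisons along one root-to-leaf path; every permutation of three keys traces its own path to a distinct leaf (the instrumentation is illustrative, not part of any standard library):

```python
from itertools import permutations

def count_comparisons(items):
    """Insertion sort instrumented to count key comparisons; each
    comparison is one internal node on a root-to-leaf path of the
    decision tree, and the sorted output is the leaf reached."""
    a = list(items)
    count = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            count += 1                      # internal node: a[j-1] <= a[j]?
            if a[j - 1] <= a[j]:
                break
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    return a, count

# every permutation of three keys follows its own path to a leaf
for p in permutations([1, 2, 3]):
    result, c = count_comparisons(p)
    print(p, "->", result, f"({c} comparisons)")
```

The deepest path uses 3 comparisons, matching the tree-height bound ⌈log₂ 3!⌉ = ⌈log₂ 6⌉ = 3.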

Sorting Lower Bound: Ω(n log n)

Theorem: Any comparison-based sorting algorithm requires Ω(n log n) comparisons in the worst case.

Proof: A sorting algorithm must distinguish between all n! permutations. Each comparison gives 1 bit of information. A binary tree with n! leaves has height ≥ log₂(n!).

Height ≥ log₂(n!) = Σ log₂ i (summing over i = 1, ..., n).

By Stirling's approximation: log₂(n!) = n log₂ n - n log₂ e + O(log n) ≈ n log₂ n - 1.44n = Ω(n log n).
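The Stirling estimate can be sanity-checked numerically; the gap between the exact value of log₂(n!) and the two leading terms stays O(log n):

```python
import math

# compare exact log2(n!) against the two leading Stirling terms
for n in (10, 100, 1000):
    exact = math.log2(math.factorial(n))
    stirling = n * math.log2(n) - n * math.log2(math.e)
    print(f"n={n}: log2(n!) = {exact:.1f}, n log2 n - 1.44n = {stirling:.1f}")
```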

Consequence: Merge sort and heapsort are asymptotically optimal comparison-based sorting algorithms.

Non-comparison sorts (counting sort, radix sort) bypass this bound by using additional structure of the keys.

Searching Lower Bound: Ω(log n)

Theorem: Any comparison-based searching algorithm on a sorted array requires Ω(log n) comparisons in the worst case.

Proof: Must distinguish between n+1 outcomes (element at position 0, 1, ..., n-1, or not present). Decision tree with n+1 leaves has height ≥ log₂(n+1) = Ω(log n).

Consequence: Binary search is optimal.
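One way to meet the bound (a sketch of one variant, not the only one) is binary search with deferred equality testing, which spends a single comparison per halving step:

```python
import math

def binary_search(a, x):
    """Deferred-equality binary search on a sorted list: one
    comparison per halving step plus a final equality test."""
    lo, hi, comps = 0, len(a), 0
    while lo < hi:                    # search range is a[lo:hi]
        mid = (lo + hi) // 2
        comps += 1
        if a[mid] < x:
            lo = mid + 1
        else:
            hi = mid
    comps += 1                        # final equality check
    if lo < len(a) and a[lo] == x:
        return lo, comps
    return -1, comps

a = list(range(0, 2000, 2))           # 1000 sorted keys
worst = max(binary_search(a, x)[1] for x in range(-1, 2001))
print(worst, "vs", math.ceil(math.log2(len(a))) + 1)
```

For n = 1000, the worst case over all queries is 11 comparisons, i.e. ⌈log₂ n⌉ + 1, within one comparison of the information-theoretic floor.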

Adversary Arguments

An adversary answers queries adaptively to force the algorithm to do maximum work, while remaining consistent with at least one valid input.

Finding the Maximum: n - 1 Comparisons

Lower bound: Any algorithm finding the maximum of n elements requires at least n - 1 comparisons.

Adversary strategy: Maintain a set of "potential maximums" (elements that haven't lost a comparison). Each comparison eliminates at most one potential maximum. Start with n potential maximums. Need to reduce to 1. Therefore, need at least n - 1 comparisons.

Upper bound: Simple tournament (compare pairs, keep winners) uses exactly n - 1 comparisons. Tight!
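The tournament upper bound is easy to check empirically. A sketch with an explicit comparison counter (each comparison eliminates exactly one contender, so the total is always n - 1):

```python
def tournament_max(items):
    """Knockout tournament: pair up contenders each round; each
    comparison eliminates exactly one element, so n - 1 in total."""
    contenders = list(items)
    comps = 0
    while len(contenders) > 1:
        survivors = []
        for i in range(0, len(contenders) - 1, 2):
            comps += 1
            survivors.append(max(contenders[i], contenders[i + 1]))
        if len(contenders) % 2:               # odd one out gets a bye
            survivors.append(contenders[-1])
        contenders = survivors
    return contenders[0], comps

print(tournament_max([7, 3, 9, 1, 4]))  # (9, 4): n = 5 needs 4 comparisons
```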

Finding Both Min and Max: ⌈3n/2⌉ - 2 Comparisons

Lower bound: Every element must win at least one comparison before it can be ruled out as the minimum, and lose at least one before it can be ruled out as the maximum, so 2n - 2 eliminations are needed in total. A comparison between two untouched elements produces two eliminations; every other comparison produces at most one. The adversary answers so that at most ⌊n/2⌋ comparisons produce two eliminations, forcing at least ⌈3n/2⌉ - 2 comparisons.

Upper bound: For even n, compare elements in pairs (n/2 comparisons), then find the min among the losers (n/2 - 1) and the max among the winners (n/2 - 1). Total: 3n/2 - 2, matching the lower bound. Tight!
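A sketch of the pairing algorithm, extended to handle odd n (the function name is illustrative):

```python
def min_and_max(a):
    """Pairing min/max: compare each pair once, then update the
    running min with the smaller and the running max with the
    larger.  Uses ceil(3n/2) - 2 comparisons in total."""
    n = len(a)
    comps = 0
    if n % 2:                         # odd n: seed both with the first element
        lo = hi = a[0]
        start = 1
    else:                             # even n: seed with the first pair
        comps += 1
        lo, hi = (a[0], a[1]) if a[0] <= a[1] else (a[1], a[0])
        start = 2
    for i in range(start, n, 2):
        comps += 1                    # intra-pair comparison
        small, big = (a[i], a[i + 1]) if a[i] <= a[i + 1] else (a[i + 1], a[i])
        comps += 1                    # loser vs running min
        if small < lo:
            lo = small
        comps += 1                    # winner vs running max
        if big > hi:
            hi = big
    return lo, hi, comps
```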

Median Finding: Lower Bounds

Finding the median requires at least 2n - o(n) comparisons in the worst case. The exact optimal constant is still not known.

Median of medians (deterministic selection) uses ~5n comparisons. Better constants exist but are more complex.

Information-Theoretic Lower Bounds

Every algorithm execution produces information that distinguishes the input from other possibilities. The minimum information needed gives a lower bound.

Principle: If there are N possible answers, at least log₂ N bits of information are needed. Each binary comparison gives 1 bit.

Examples:

  • Sorting: N = n! → log₂(n!) = Ω(n log n) bits.
  • Searching: N = n → log₂ n bits.
  • Minimum: N = n → log₂ n bits. But the adversary argument gives the far stronger bound of n - 1 comparisons: the information-theoretic bound is too loose here.
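The principle reduces to one line of arithmetic; a small helper (illustrative) evaluates all three examples:

```python
import math

def info_lower_bound(num_outcomes):
    """Minimum number of binary comparisons needed to distinguish
    num_outcomes possible answers: ceil(log2 of the count)."""
    return math.ceil(math.log2(num_outcomes))

n = 1024
print("sorting:  ", info_lower_bound(math.factorial(n)))   # order n log n
print("searching:", info_lower_bound(n + 1))               # order log n
print("minimum:  ", info_lower_bound(n), "(adversary bound:", n - 1, ")")
```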

Reduction-Based Lower Bounds

If problem A can be reduced to problem B (in time f(n)), and problem A has lower bound Ω(g(n)), then B has lower bound Ω(g(n) - f(n)).

Example: Element uniqueness (are all elements distinct?) reduces to sorting. Since element uniqueness requires Ω(n log n) comparisons in the algebraic decision tree model, sorting also requires Ω(n log n).

Example: Sorting reduces to convex hull (given n numbers xᵢ, map to points (xᵢ, xᵢ²) — the convex hull gives the sorted order). Since sorting is Ω(n log n), convex hull is Ω(n log n).
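The lifting reduction can be demonstrated end to end. This sketch uses gift wrapping as the hull black box because, unlike monotone chain, it does not sort internally; gift wrapping is O(nh), so this is a correctness demo rather than an efficient reduction (any O(n log n) hull algorithm would serve):

```python
def cross(o, a, b):
    """Cross product of vectors o->a and o->b; > 0 iff b is left of o->a."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def jarvis_hull(pts):
    """Gift-wrapping convex hull in counterclockwise order; no sorting."""
    start = min(pts)                       # leftmost point
    hull, cur = [], start
    while True:
        hull.append(cur)
        cand = pts[0] if pts[0] != cur else pts[1]
        for p in pts:
            if p != cur and cross(cur, cand, p) < 0:
                cand = p                   # p is right of cur->cand: wrap to it
        cur = cand
        if cur == start:
            return hull

def sort_via_hull(xs):
    """Sort distinct numbers by lifting each x to (x, x^2): the CCW
    hull, starting from the leftmost point, walks the parabola's
    lower chain in increasing x."""
    pts = [(x, x * x) for x in xs]
    return [x for x, _ in jarvis_hull(pts)]

print(sort_via_hull([3, -1, 4, 1, -5, 9, 2, 6]))  # [-5, -1, 1, 2, 3, 4, 6, 9]
```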

Algebraic Decision Tree Model

A generalization of the comparison model where each decision is based on a polynomial predicate of constant degree.

Ben-Or's Theorem (1983): In the algebraic decision tree model with degree d, any algorithm for a problem with W connected components requires depth Ω(log W - n).

Application: For element uniqueness, the set of inputs with all elements distinct has n! connected components (one per ordering of the inputs) → Ω(log n! - n) = Ω(n log n) in the algebraic decision tree model.

Significance: Even if we allow comparisons of the form "is f(x₁,...,xₙ) > 0?" for polynomials of bounded degree, sorting and element uniqueness still require Ω(n log n).

Limitations of Lower Bounds

Model-dependent: The Ω(n log n) sorting lower bound applies only to comparison-based sorting. Counting sort achieves O(n + k) by using non-comparison operations.
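For instance, a counting sort sketch, assuming integer keys drawn from a known range [0, k):

```python
def counting_sort(a, k):
    """Sort integers drawn from range(k) in O(n + k) time.  Keys are
    used as array indices, never compared to each other, so the
    Omega(n log n) comparison lower bound does not apply."""
    counts = [0] * k
    for x in a:                       # tally each key
        counts[x] += 1
    out = []
    for v in range(k):                # emit keys in index order
        out.extend([v] * counts[v])
    return out

print(counting_sort([4, 1, 3, 4, 0, 2, 1], 5))  # [0, 1, 1, 2, 3, 4, 4]
```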

Gap between upper and lower bounds: For many problems, there's a gap between the best known algorithm and the best known lower bound. Closing this gap is a major research challenge.

Example: Matrix multiplication. Best upper bound: O(n^2.371). Best lower bound: Ω(n²) (must read all inputs). The gap is enormous.

Applications in CS

  • Algorithm design: Knowing a lower bound tells you when your algorithm is optimal (stop trying to improve) or when there's room for improvement.
  • Complexity theory: Lower bounds for NP-hard problems (conditional on P ≠ NP) guide the search for approximation algorithms.
  • Data structures: Lower bounds for priority queues, dictionaries, and other ADTs guide design tradeoffs.
  • Cryptography: Security relies on lower bounds for certain computational problems (factoring, discrete log).