
Fine-Grained Complexity

Motivation

Classical complexity theory asks whether problems are in P or NP-hard. Fine-grained complexity asks: within P, what are the exact polynomial exponents? Is O(n^2) optimal for a given problem, or can we achieve O(n^{1.99})?

Since unconditional lower bounds within P are essentially impossible with current techniques (we cannot even prove P != NP), fine-grained complexity uses conditional lower bounds: reductions from conjectured-hard problems.

Core Conjectures

SETH (Strong Exponential Time Hypothesis)

Conjecture (Impagliazzo-Paturi, 2001): For every epsilon > 0, there exists k such that k-SAT on n variables cannot be solved in O(2^{(1-epsilon)n}) time.

Weaker variant (ETH): 3-SAT cannot be solved in 2^{o(n)} time.

SETH implies ETH. SETH asserts that the brute-force exponential dependence on the number of variables is essentially optimal for SAT. Current best for k-SAT: O(2^{n(1-c/k)}) for a constant c (Paturi-Pudlak-Saks-Zane; Hertli).
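The 2^n brute-force baseline that SETH conjectures is essentially optimal (as k grows) can be sketched as follows. This is a hedged illustration of exhaustive search, not a competitive solver; the clause encoding (signed integers for literals) is a convention chosen here for brevity:

```python
from itertools import product

def brute_force_sat(num_vars, clauses):
    """Decide satisfiability by trying all 2^n assignments.

    A clause is a list of literals: +i means variable i is true,
    -i means variable i is false (variables numbered 1..num_vars).
    """
    for assignment in product([False, True], repeat=num_vars):
        if all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True
    return False
```

For k-SAT, the best known improvements (Paturi-Pudlak-Saks-Zane; Hertli) shave only a c/k fraction off the exponent of this loop.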

SETH and circuit complexity: SETH implies that E = DTIME(2^{O(n)}) does not have non-uniform linear-size circuits.

Orthogonal Vectors (OV) Conjecture

Conjecture: Given two sets A, B of n vectors in {0,1}^d, determining whether there exist a in A, b in B with a . b = 0 (inner product zero over the integers) requires n^{2-o(1)} time when d = omega(log n).
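The brute-force O(n^2 * d) check, which the conjecture asserts cannot be improved to n^{2-epsilon}, is a few lines (a minimal sketch):

```python
def has_orthogonal_pair(A, B):
    """Naive O(n^2 * d) scan over all pairs: report whether some a in A
    and b in B have integer inner product zero."""
    return any(all(x * y == 0 for x, y in zip(a, b))
               for a in A for b in B)
```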

Theorem (Williams, 2005): SETH implies the OV conjecture.

Proof: Reduce k-SAT to OV. Split the n variables into two halves and enumerate all 2^{n/2} partial assignments to each half. For each partial assignment to the first half, create a vector in A encoding which clauses it fails to satisfy; similarly for B and the second half. Two vectors are orthogonal iff the combined assignment satisfies every clause. With N = 2^{n/2} vectors per side, an N^{2-epsilon} algorithm for OV yields a 2^{(n/2)(2-epsilon)} = 2^{n(1-epsilon/2)} algorithm for SAT, violating SETH.
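The split-and-encode construction can be sketched directly; this is an illustrative toy implementation (the function name and clause encoding are conventions chosen here), with coordinate j set to 1 exactly when the partial assignment does not already satisfy clause j:

```python
from itertools import product

def sat_to_ov(num_vars, clauses):
    """Williams-style reduction: split variables into two halves and emit
    one 0/1 vector per partial assignment; coordinate j is 0 iff the
    partial assignment already satisfies clause j. The formula is
    satisfiable iff A and B contain an orthogonal pair."""
    half = num_vars // 2

    def vectors(var_indices):
        vecs = []
        for bits in product([False, True], repeat=len(var_indices)):
            partial = dict(zip(var_indices, bits))
            # A literal l is satisfied when its variable is assigned and
            # the assigned value matches the literal's sign.
            vec = [0 if any(partial.get(abs(l)) == (l > 0) for l in clause)
                   else 1
                   for clause in clauses]
            vecs.append(vec)
        return vecs

    A = vectors(list(range(1, half + 1)))
    B = vectors(list(range(half + 1, num_vars + 1)))
    return A, B
```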

OV is the most versatile source of fine-grained hardness within P.

3SUM Conjecture

Conjecture: Given n integers, determining whether three of them sum to zero requires n^{2-o(1)} time.
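The quadratic upper bound the conjecture matches is the classic sort-and-two-pointers algorithm (a standard sketch; here the three indices are required to be distinct):

```python
def three_sum(nums):
    """Classic O(n^2) algorithm: sort, then for each element scan the
    remaining suffix with two pointers looking for a pair that completes
    a zero sum."""
    a = sorted(nums)
    n = len(a)
    for i in range(n - 2):
        lo, hi = i + 1, n - 1
        while lo < hi:
            s = a[i] + a[lo] + a[hi]
            if s == 0:
                return True
            if s < 0:
                lo += 1   # total too small: move left pointer right
            else:
                hi -= 1   # total too large: move right pointer left
    return False
```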

Current status: The best algorithms run in O(n^2 / polylog(n)) time (Gronlund-Pettie, 2014; Chan, 2018). These slightly subquadratic results do not refute the conjecture, which permits n^{o(1)} savings in the exponent but rules out any constant epsilon.

3SUM in restricted models: In the 4-linear decision tree model, O(n^{3/2}) queries suffice (Gronlund-Pettie), and Kane-Lovett-Moran (2019) achieved O(n log^2 n) queries with 6-linear decision trees. The conjecture is thus model-dependent: the quadratic barrier concerns uniform running time, not information-theoretic query complexity.

APSP Conjecture

Conjecture: All-Pairs Shortest Paths on n-node graphs with integer edge weights requires n^{3-o(1)} time.

Current best: O(n^3 / 2^{Omega(sqrt{log n})}) (Williams, 2014), slightly subcubic but not truly n^{3-epsilon}.
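The cubic baseline that the conjecture asserts is essentially optimal is Floyd-Warshall (a standard textbook sketch, assuming no negative cycles):

```python
def floyd_warshall(w):
    """Textbook O(n^3) APSP. w is an n x n matrix of edge weights, with
    w[i][i] = 0 and float('inf') for absent edges; returns the matrix of
    shortest-path distances."""
    n = len(w)
    d = [row[:] for row in w]
    for k in range(n):               # allow k as an intermediate node
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```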

Related conjecture (BMM): Boolean Matrix Multiplication (compute the Boolean product of two n x n Boolean matrices) requires n^{3-o(1)} time if combinatorial algorithms are required (i.e., without fast matrix multiplication). This is the combinatorial BMM conjecture.

Fine-Grained Reductions

A fine-grained reduction from problem A to problem B (with conjectured time bounds t_A(n) and t_B(n)) is a transformation such that, for every epsilon > 0, an O(t_B(n)^{1-epsilon}) algorithm for B yields an O(t_A(n)^{1-epsilon'}) algorithm for A, for some epsilon' > 0.

Unlike classical reductions, fine-grained reductions must carefully preserve the polynomial exponent, not merely polynomial-time computability.

Reduction Taxonomy

| Source Hypothesis | Conjectured Hard Problem | Lower Bound |
|---|---|---|
| SETH | k-SAT | 2^{(1-o(1))n} |
| OV | Orthogonal Vectors | n^{2-o(1)} |
| 3SUM | Three numbers summing to 0 | n^{2-o(1)} |
| APSP | All-Pairs Shortest Paths | n^{3-o(1)} |
| BMM | Boolean Matrix Multiplication | n^{3-o(1)} (combinatorial) |

Conditional Lower Bounds for Key Problems

Edit Distance

Theorem (Backurs-Indyk, 2015): If SETH holds, then edit distance between two strings of length n cannot be computed in O(n^{2-epsilon}) time for any epsilon > 0.

Reduction: From OV to Edit Distance. Given OV instance (A, B), construct strings s_A and s_B of length O(n * d) such that edit_distance(s_A, s_B) reveals whether an orthogonal pair exists. The construction uses "gadget strings" that encode vector coordinates, with alignment penalties corresponding to inner products.

Current best: O(n^2 / log^2 n) for exact computation (Masek-Paterson), and constant-factor approximations are computable in truly subquadratic time (Andoni-Krauthgamer-Onak; Chakraborty-Das-Goldenberg-Koucky-Saks).
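The quadratic dynamic program that the Backurs-Indyk result shows is near-optimal under SETH is the standard one (sketched here with a rolling row to use O(n) space):

```python
def edit_distance(s, t):
    """Standard O(|s|*|t|) dynamic program: unit-cost insertions,
    deletions, and substitutions."""
    n = len(t)
    prev = list(range(n + 1))          # distances from "" to prefixes of t
    for i in range(1, len(s) + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cur[j] = min(prev[j] + 1,                            # delete s[i-1]
                         cur[j - 1] + 1,                         # insert t[j-1]
                         prev[j - 1] + (s[i - 1] != t[j - 1]))   # substitute
        prev = cur
    return prev[n]
```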

Longest Common Subsequence (LCS)

Theorem (Abboud-Backurs-Vassilevska Williams, 2015): SETH implies LCS of two length-n strings requires n^{2-o(1)} time.

Reduction: Similar to edit distance -- from OV via string constructions. LCS and edit distance are closely related (for binary strings, computing one essentially computes the other).
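For comparison with edit distance, the quadratic LCS dynamic program (a standard sketch, again with a rolling row):

```python
def lcs_length(s, t):
    """Standard O(|s|*|t|) dynamic program for the length of the
    longest common subsequence."""
    n = len(t)
    prev = [0] * (n + 1)
    for ch in s:
        cur = [0] * (n + 1)
        for j in range(1, n + 1):
            if ch == t[j - 1]:
                cur[j] = prev[j - 1] + 1       # extend a common subsequence
            else:
                cur[j] = max(prev[j], cur[j - 1])
        prev = cur
    return prev[n]
```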

Frechet Distance

Theorem (Bringmann, 2014): SETH implies that the discrete Frechet distance between two polygonal curves of n points in the plane requires n^{2-o(1)} time.

Graph Diameter and Radius

Theorem (Roditty-Vassilevska Williams, 2013): SETH implies that computing the diameter of an unweighted undirected graph on n nodes and m edges requires (nm)^{1-o(1)} time.

More precisely, distinguishing between diameter 2 and diameter 3 in sparse graphs requires n^{2-o(1)} time under SETH. This is because an n^{2-epsilon} algorithm for diameter would yield an n^{2-epsilon} algorithm for OV.
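The O(nm) upper bound this hardness matches is simply BFS from every node (a minimal sketch, assuming a connected graph given as an adjacency list):

```python
from collections import deque

def diameter(adj):
    """O(nm) baseline: run BFS from every node of a connected unweighted
    graph; the diameter is the largest eccentricity found."""
    best = 0
    for s in range(len(adj)):
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        best = max(best, max(dist.values()))
    return best
```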

Pattern Matching

Theorem (Abboud-Vassilevska Williams-Yu, 2015): Under SETH, "pattern matching with wildcards" requires n^{2-o(1)} time in certain parameter regimes.

Dynamic Problems

Fine-grained complexity has been particularly fruitful for dynamic problems:

Theorem (Henzinger-Krinninger-Nanongkai-Saranurak, 2015): Under the OMv conjecture (Online Boolean Matrix-Vector multiplication requires n^{3-o(1)} total time to answer n query vectors), many dynamic graph problems require n^{1-o(1)} update or query time:

  • Dynamic reachability
  • Dynamic shortest paths (approximate)
  • Dynamic bipartite matching (approximate)
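The OMv setting can be sketched as follows: the query vectors arrive online, and each answer must be produced before the next vector is seen. The naive method below spends O(n^2) per query, hence O(n^3) total for n queries, which is the cost the conjecture asserts cannot be improved to n^{3-epsilon}:

```python
def omv_naive(M, vectors):
    """Online Boolean matrix-vector multiplication: answer each query v
    with Mv over the Boolean semiring (OR of ANDs) before the next query
    arrives. O(n^2) per query, O(n^3) total for n queries."""
    results = []
    for v in vectors:   # queries processed one at a time, in order
        results.append([int(any(M[i][j] and v[j] for j in range(len(v))))
                        for i in range(len(M))])
    return results
```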

Problems from 3SUM

3SUM-hard problems (requiring n^{2-o(1)} time under the 3SUM conjecture):

  • GeomBase: Given n lines, determine if any three meet at a point
  • Point on 3 lines
  • Segment intersection counting
  • Many computational geometry problems (via Gajentaan-Overmars reductions)

Problems from APSP

APSP-equivalent problems (subcubic equivalences):

  • Negative triangle detection (does the graph contain a triangle with negative total weight?)
  • Replacement paths (shortest paths avoiding each edge)
  • Second shortest simple path
  • Minimum-weight triangle
  • Radius of a weighted graph

Theorem (Vassilevska Williams-Williams, 2010): Negative triangle and APSP are subcubically equivalent. An n^{3-epsilon} algorithm for one gives an n^{3-epsilon'} algorithm for the other.
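The cubic brute force for negative triangle, which the equivalence says cannot be beaten without also beating APSP, is a triple loop (a minimal sketch for undirected graphs given as a symmetric weight matrix, with float('inf') for absent edges):

```python
def has_negative_triangle(w):
    """Brute-force O(n^3) scan over all triples i < j < k; w[i][j] is the
    weight of edge (i, j)."""
    n = len(w)
    return any(w[i][j] + w[j][k] + w[k][i] < 0
               for i in range(n)
               for j in range(i + 1, n)
               for k in range(j + 1, n))
```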

Relationships Between Conjectures

The conjectures are not known to imply each other in general:

  • SETH implies the OV conjecture (direct reduction)
  • SETH does not obviously imply 3SUM or APSP conjectures
  • 3SUM and APSP conjectures are independent (no known reductions between them)
  • Refuting SETH would not necessarily refute 3SUM or APSP conjectures

This independence is important: it means different problems have genuinely different sources of hardness.

Nondeterministic Fine-Grained Complexity

NSETH (Nondeterministic SETH, Carmosino et al., 2016): For every epsilon > 0 there exists k such that k-TAUT (equivalently, refuting unsatisfiable k-CNFs) cannot be solved in O(2^{(1-epsilon)n}) nondeterministic time; i.e., unsatisfiability of k-CNFs has no subexponentially shorter-than-brute-force proofs.

Since a deterministic algorithm is a special case of a nondeterministic one, NSETH implies SETH. Its main use is negative: a problem that admits fast nondeterministic and co-nondeterministic algorithms cannot be SETH-hard under deterministic fine-grained reductions unless NSETH fails, which rules out certain reductions rather than proving new lower bounds.

Quantum Fine-Grained Complexity

Quantum algorithms can sometimes break fine-grained barriers:

  • OV: Grover search over the n^2 candidate pairs gives roughly O(n * poly(d)) quantum time, breaking the classical quadratic barrier
  • 3SUM: Grover search over pairs, with the third element located by binary search in a sorted copy of the input, similarly gives ~O(n) quantum time

Classical fine-grained conjectures therefore do not transfer directly to the quantum setting; quantum analogues such as QSETH have been proposed to play the corresponding role there.

Average-Case Fine-Grained Complexity

Theorem (Ball-Rosen-Sabin-Vasudevan, 2017): Under average-case versions of OV and SETH, the average-case complexity of certain problems (e.g., Zero-k-Clique) matches worst-case conjectured bounds.

This connects to fine-grained cryptography: if OV is hard on average, one can build fine-grained one-way functions and commitments.

Open Problems

  • Prove any unconditional superlinear lower bound for a natural problem in P (in a reasonable model)
  • Are the SETH, 3SUM, and APSP conjectures independent? Can one be derived from another?
  • What is the true complexity of edit distance? Is there an O(n^{1.9}) algorithm?
  • Can fine-grained reductions be derandomized in general?
  • Is there a "master conjecture" that unifies SETH, 3SUM, and APSP?
  • What problems in P are equivalent to each other up to subpolynomial factors?