
Single Variable Calculus

Calculus is the mathematics of change and accumulation. It provides the tools for optimization, approximation, and analysis that underpin machine learning, numerical methods, and algorithm analysis.

Limits

The limit of f(x) as x approaches a:

lim_{x→a} f(x) = L

means f(x) can be made arbitrarily close to L by taking x sufficiently close to a.

Formal Definition (ε-δ)

For every ε > 0, there exists δ > 0 such that:

0 < |x - a| < δ  ⟹  |f(x) - L| < ε

Limit Laws

If lim f(x) = L and lim g(x) = M:

  • lim (f ± g) = L ± M
  • lim (f · g) = L · M
  • lim (f / g) = L / M (if M ≠ 0)
  • lim f(x)ⁿ = Lⁿ

One-Sided Limits

  • Left limit: lim_{x→a⁻} f(x) — approach from below
  • Right limit: lim_{x→a⁺} f(x) — approach from above
  • The (two-sided) limit exists iff both one-sided limits exist and are equal.

Limits at Infinity

lim_{x→∞} 1/x = 0
lim_{x→∞} eˣ = ∞
lim_{x→∞} (1 + 1/x)ˣ = e

Important Limits

lim_{x→0} sin(x)/x = 1
lim_{x→0} (eˣ - 1)/x = 1
lim_{x→0} (1 + x)^(1/x) = e
lim_{n→∞} (1 + r/n)ⁿ = eʳ
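
These limits can be sanity-checked numerically by evaluating near the limit point (a quick sketch; floating-point error limits how small x or how large n can usefully get):

```python
import math

# sin(x)/x -> 1 as x -> 0; the error behaves like x^2/6
x = 1e-6
assert abs(math.sin(x) / x - 1) < 1e-9

# (1 + 1/n)^n -> e as n -> infinity; the error behaves like e/(2n)
n = 1_000_000
approx_e = (1 + 1 / n) ** n
assert abs(approx_e - math.e) < 1e-3
```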

Continuity

f is continuous at a if:

  1. f(a) is defined
  2. lim_{x→a} f(x) exists
  3. lim_{x→a} f(x) = f(a)

Types of discontinuity:

  • Removable: Limit exists but ≠ f(a) or f(a) undefined.
  • Jump: Left and right limits exist but differ.
  • Infinite: f(x) → ±∞ as x → a.

Key theorems for continuous functions on [a, b]:

  • Intermediate Value Theorem (IVT): f takes every value between f(a) and f(b). Used for root-finding (bisection method).
  • Extreme Value Theorem: f attains a maximum and minimum.
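
The IVT is exactly what makes bisection work: if a continuous f changes sign on [a, b], it must cross zero somewhere in between. A minimal sketch (the function name `bisect` here is illustrative, not the stdlib `bisect` module):

```python
def bisect(f, a, b, tol=1e-10):
    """Root-finding via the IVT: if f is continuous and f(a), f(b) have
    opposite signs, f has a root in [a, b]; halve the interval repeatedly."""
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "f(a) and f(b) must have opposite signs"
    while b - a > tol:
        m = (a + b) / 2
        fm = f(m)
        if fa * fm <= 0:      # sign change in [a, m]
            b, fb = m, fm
        else:                 # sign change in [m, b]
            a, fa = m, fm
    return (a + b) / 2

# sqrt(2) is the root of x^2 - 2 on [1, 2]
root = bisect(lambda x: x * x - 2, 1.0, 2.0)
```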

Derivatives

The derivative of f at x:

f'(x) = lim_{h→0} (f(x+h) - f(x)) / h

Measures the instantaneous rate of change (slope of the tangent line).
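
Replacing the limit with a small finite h gives the finite-difference approximations used throughout numerical computing. A sketch of the forward difference (a direct translation of the definition) and the more accurate symmetric variant:

```python
import math

def forward_diff(f, x, h=1e-6):
    """The limit definition with a small finite h; error is O(h)."""
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h=1e-6):
    """Symmetric variant (f(x+h) - f(x-h)) / (2h); error is O(h^2)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# f = sin, so f'(1) = cos(1)
fwd = forward_diff(math.sin, 1.0)
ctr = central_diff(math.sin, 1.0)
```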

Differentiation Rules

| Rule | Formula |
|---|---|
| Constant | d/dx [c] = 0 |
| Power | d/dx [xⁿ] = nxⁿ⁻¹ |
| Sum | (f + g)' = f' + g' |
| Product | (fg)' = f'g + fg' |
| Quotient | (f/g)' = (f'g - fg') / g² |
| Chain | (f∘g)'(x) = f'(g(x)) · g'(x) |
| Inverse | (f⁻¹)'(y) = 1 / f'(f⁻¹(y)) |

Common Derivatives

d/dx [eˣ] = eˣ              d/dx [ln x] = 1/x
d/dx [sin x] = cos x        d/dx [cos x] = -sin x
d/dx [tan x] = sec² x       d/dx [aˣ] = aˣ ln a
d/dx [arcsin x] = 1/√(1-x²) d/dx [arctan x] = 1/(1+x²)

Higher-Order Derivatives

f''(x) = d²f/dx² (second derivative — concavity, acceleration).

f⁽ⁿ⁾(x) = dⁿf/dxⁿ (n-th derivative).

Mean Value Theorem

If f is continuous on [a, b] and differentiable on (a, b), there exists c ∈ (a, b) such that:

f'(c) = (f(b) - f(a)) / (b - a)

The instantaneous rate equals the average rate somewhere.

Consequences:

  • If f'(x) = 0 on an interval, f is constant.
  • If f'(x) > 0 on an interval, f is increasing.
  • If f'(x) = g'(x) on an interval, f and g differ by a constant.
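
The guaranteed point c can be found numerically. A small sketch for f(x) = x³ on [0, 1], where the average rate is 1 and MVT promises some c with f'(c) = 3c² = 1, i.e. c = 1/√3:

```python
f = lambda x: x ** 3
a, b = 0.0, 1.0
avg_rate = (f(b) - f(a)) / (b - a)   # = 1

# Solve f'(c) - avg_rate = 0 on (0, 1) by bisection;
# f'(c) = 3c^2 is increasing here, so the sign test is simple.
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if 3 * mid ** 2 - avg_rate < 0:
        lo = mid
    else:
        hi = mid
c = (lo + hi) / 2
```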

L'Hôpital's Rule

For indeterminate forms 0/0 or ∞/∞, if f and g are differentiable near the point (with g' ≠ 0 there):

lim f(x)/g(x) = lim f'(x)/g'(x)

(provided the right-hand limit exists).

Example: lim_{x→0} sin(x)/x = lim_{x→0} cos(x)/1 = 1.

Other indeterminate forms (0·∞, ∞-∞, 0⁰, 1^∞, ∞⁰) can be converted to 0/0 or ∞/∞ using algebraic manipulation or logarithms.
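
For example, the 0⁰ form x^x as x → 0⁺: taking logs gives ln(x^x) = x ln x, a 0·∞ form; rewriting as ln(x)/(1/x) (an ∞/∞ form), L'Hôpital gives (1/x)/(-1/x²) = -x → 0, so x^x → e⁰ = 1. A quick numerical check:

```python
import math

x = 1e-8
log_val = x * math.log(x)     # x ln x, tends to 0 as x -> 0+
val = math.exp(log_val)       # equals x ** x, tends to 1
```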

Taylor Series

The Taylor series of f about x = a:

f(x) = Σₙ₌₀^∞ f⁽ⁿ⁾(a)/n! · (x - a)ⁿ

When a = 0, this is the Maclaurin series.

Important Taylor Series

eˣ = 1 + x + x²/2! + x³/3! + ...              (converges for all x)
sin x = x - x³/3! + x⁵/5! - ...               (converges for all x)
cos x = 1 - x²/2! + x⁴/4! - ...               (converges for all x)
ln(1+x) = x - x²/2 + x³/3 - ...               (converges for -1 < x ≤ 1)
1/(1-x) = 1 + x + x² + x³ + ...               (converges for |x| < 1)
(1+x)ᵅ = 1 + αx + α(α-1)x²/2! + ...          (binomial series; converges for |x| < 1)

Taylor's Theorem with Remainder

f(x) = Σₖ₌₀ⁿ f⁽ᵏ⁾(a)/k! · (x-a)ᵏ + Rₙ(x)

Lagrange remainder: Rₙ(x) = f⁽ⁿ⁺¹⁾(c)/(n+1)! · (x-a)ⁿ⁺¹ for some c between a and x.

This bounds the approximation error, crucial for numerical methods.
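
A sketch of the error bound in action for e^x about a = 0: the computed error of the degree-n partial sum stays below the Lagrange bound e·x^(n+1)/(n+1)! (valid for 0 < x ≤ 1, since f⁽ⁿ⁺¹⁾(c) = e^c ≤ e there):

```python
import math

def taylor_exp(x, n):
    """Degree-n Taylor polynomial of e^x about a = 0."""
    return sum(x ** k / math.factorial(k) for k in range(n + 1))

x, n = 1.0, 10
approx = taylor_exp(x, n)
err = abs(math.exp(x) - approx)
# Lagrange remainder bound: |R_n| <= e^c * x^(n+1)/(n+1)! <= e / 11!
bound = math.e * x ** (n + 1) / math.factorial(n + 1)
```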

Integration

Riemann Integration

The definite integral ∫ₐᵇ f(x)dx is the signed area under f from a to b.

Defined as the limit of Riemann sums:

∫ₐᵇ f(x)dx = lim_{n→∞} Σᵢ₌₁ⁿ f(xᵢ*) Δx
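Taking a large but finite n gives basic numerical integration. A minimal sketch using midpoints as the sample points xᵢ*:

```python
def riemann(f, a, b, n=100_000):
    """Midpoint Riemann sum with n equal subintervals of width dx."""
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx

# integral of x^2 from 0 to 1 is exactly 1/3
approx = riemann(lambda x: x * x, 0.0, 1.0)
```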

Fundamental Theorem of Calculus

Part 1: If F(x) = ∫ₐˣ f(t)dt, then F'(x) = f(x).

Part 2: ∫ₐᵇ f(x)dx = F(b) - F(a) where F' = f.

This connects differentiation and integration as inverse operations.

Integration Techniques

Substitution (u-substitution): ∫ f(g(x))g'(x)dx = ∫ f(u)du.

Integration by parts: ∫ u dv = uv - ∫ v du.

Partial fractions: Decompose rational functions before integrating.

Trigonometric substitution: For integrands involving √(a²-x²), √(a²+x²), √(x²-a²).

Important Integrals

∫ xⁿ dx = xⁿ⁺¹/(n+1) + C     (n ≠ -1)
∫ 1/x dx = ln|x| + C
∫ eˣ dx = eˣ + C
∫ sin x dx = -cos x + C
∫ 1/(1+x²) dx = arctan x + C
∫₀^∞ e⁻ˣ² dx = √π/2           (Gaussian integral)

Improper Integrals

Integrals with infinite limits or unbounded integrands.

∫₁^∞ 1/xᵖ dx = { 1/(p-1)   if p > 1 (converges)
               { ∞         if p ≤ 1 (diverges)

Comparison test: If 0 ≤ f(x) ≤ g(x) and ∫g converges, then ∫f converges.

Gamma function: Γ(n) = ∫₀^∞ xⁿ⁻¹e⁻ˣ dx = (n-1)! for positive integers.
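
Python's standard library exposes Γ directly as `math.gamma`, so the factorial identity (and the half-integer value Γ(1/2) = √π, the source of the Gaussian integral above) can be checked:

```python
import math

# Gamma(n) = (n-1)! for positive integers
for n in range(1, 10):
    assert math.isclose(math.gamma(n), math.factorial(n - 1))

# Gamma(1/2) = sqrt(pi)
half = math.gamma(0.5)
```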

Applications in CS

  • Optimization: Finding minima/maxima via f'(x) = 0. Gradient descent uses derivatives.
  • Algorithm analysis: Integrals approximate sums: Σf(k) ≈ ∫f(x)dx. Integral test for series convergence.
  • Numerical methods: Taylor series justify finite difference approximations. Error analysis uses the remainder term.
  • Probability: PDFs integrate to 1. Expected values are integrals. The normal distribution uses e⁻ˣ².
  • Machine learning: Backpropagation is the chain rule. Loss function minimization uses calculus.
  • Information theory: Entropy h(X) = -∫f(x)ln f(x)dx (differential entropy).
  • Asymptotic analysis: Stirling's approximation (n! ≈ √(2πn)(n/e)ⁿ) is derived using calculus.
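
As a concrete instance of the optimization bullet, a one-dimensional gradient descent sketch (the step size `lr` and step count are illustrative choices, not universal defaults):

```python
def grad_descent(grad, x0, lr=0.1, steps=200):
    """Minimize f by repeatedly stepping against its derivative:
    x <- x - lr * f'(x)."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# f(x) = (x - 3)^2 has f'(x) = 2(x - 3); the unique minimum is at x = 3
x_min = grad_descent(lambda x: 2 * (x - 3), x0=0.0)
```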