The Complete Math Formula Reference

A curated collection of essential mathematical formulas across 10 major topics — from foundational algebra to advanced differential equations.

Quadratic & Polynomial

Quadratic Formula
$$x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}$$
Solves $ax^2+bx+c=0$ for $a\neq 0$. The discriminant $\Delta = b^2-4ac$ determines the nature of the roots: $\Delta>0$ gives two distinct real roots, $\Delta=0$ one repeated real root, $\Delta<0$ two complex conjugate roots.
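The formula can be checked numerically; a minimal Python sketch (function name is illustrative) that handles all three discriminant cases by working over the complex numbers:

```python
import cmath  # complex sqrt covers the Delta < 0 case

def quadratic_roots(a, b, c):
    """Roots of ax^2 + bx + c = 0 via the quadratic formula."""
    d = b * b - 4 * a * c          # discriminant Delta
    sqrt_d = cmath.sqrt(d)
    return (-b + sqrt_d) / (2 * a), (-b - sqrt_d) / (2 * a)

r1, r2 = quadratic_roots(1, -3, 2)  # x^2 - 3x + 2 = (x-1)(x-2)
```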
Vieta's Formulas
$$x_1+x_2 = -\frac{b}{a}, \quad x_1 x_2 = \frac{c}{a}$$
Relations between roots and coefficients of $ax^2+bx+c=0$. Generalizes to higher-degree polynomials via elementary symmetric polynomials.
Binomial Theorem
$$(a+b)^n = \sum_{k=0}^{n} \binom{n}{k} a^{n-k} b^k$$
Where $\binom{n}{k} = \dfrac{n!}{k!(n-k)!}$ is the binomial coefficient. Valid for all real $a, b$ and non-negative integer $n$.
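A quick numerical sanity check of the expansion, summing the terms with Python's built-in binomial coefficient (the helper name is illustrative):

```python
from math import comb

def binomial_expand(a, b, n):
    """Evaluate (a+b)^n term by term via the binomial theorem."""
    return sum(comb(n, k) * a ** (n - k) * b ** k for k in range(n + 1))

lhs = (2 + 3) ** 5          # direct computation: 5^5
rhs = binomial_expand(2, 3, 5)
```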
Difference of Powers
$$a^n - b^n = (a-b)\sum_{k=0}^{n-1} a^{n-1-k}b^k$$
Special cases: $a^2-b^2=(a-b)(a+b)$; $a^3-b^3=(a-b)(a^2+ab+b^2)$.

Logarithm & Exponent Rules

Logarithm Laws
$$\log_b(xy) = \log_b x + \log_b y$$ $$\log_b\!\left(\frac{x}{y}\right) = \log_b x - \log_b y$$ $$\log_b(x^n) = n\log_b x$$
Fundamental properties. Valid for $b>0, b\neq 1$, $x,y>0$.
Change of Base
$$\log_b x = \frac{\ln x}{\ln b} = \frac{\log x}{\log b}$$
Converts any logarithm to natural log ($\ln$) or common log ($\log_{10}$).
Exponent Rules
$$a^m \cdot a^n = a^{m+n}$$ $$\frac{a^m}{a^n} = a^{m-n}, \quad (a^m)^n = a^{mn}$$ $$a^{-n} = \frac{1}{a^n}, \quad a^{1/n} = \sqrt[n]{a}$$
Core rules for manipulating exponential expressions.
AM–GM Inequality
$$\frac{a_1+a_2+\cdots+a_n}{n} \geq \sqrt[n]{a_1 a_2 \cdots a_n}$$
Arithmetic mean is always $\geq$ geometric mean for non-negative reals. Equality holds iff all values are equal.
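The inequality is easy to verify on sample data; a minimal sketch (the values are arbitrary):

```python
from math import prod

data = [2.0, 8.0, 4.0]
am = sum(data) / len(data)            # arithmetic mean: 14/3
gm = prod(data) ** (1 / len(data))    # geometric mean: 64^(1/3) = 4
```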

Sequences & Progressions

Arithmetic Progression
$$a_n = a_1 + (n-1)d, \quad S_n = \frac{n}{2}(a_1+a_n)$$
Common difference $d = a_{k+1} - a_k$. Sum of first $n$ terms.
Geometric Progression
$$a_n = a_1 r^{n-1}, \quad S_n = a_1\frac{1-r^n}{1-r}\ (r\neq1)$$
Common ratio $r = a_{k+1}/a_k$. For $|r|<1$: $S_\infty = \dfrac{a_1}{1-r}$.
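Both the partial-sum formula and the infinite-sum limit can be checked directly; a small sketch with illustrative names:

```python
def geometric_sum(a1, r, n):
    """Sum of the first n terms of a geometric progression."""
    if r == 1:
        return a1 * n                  # degenerate case: n copies of a1
    return a1 * (1 - r ** n) / (1 - r)

partial = geometric_sum(1, 0.5, 10)    # 1 + 1/2 + ... + 1/2^9
limit = 1 / (1 - 0.5)                  # S_inf = a1 / (1 - r), |r| < 1
```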
Factorial & Permutation
$$n! = n\cdot(n-1)\cdots 2\cdot 1, \quad P(n,r) = \frac{n!}{(n-r)!}$$
$0!=1$ by convention. $P(n,r)$ counts ordered arrangements of $r$ items from $n$.
Combination
$$C(n,r) = \binom{n}{r} = \frac{n!}{r!(n-r)!}$$
Number of ways to choose $r$ items from $n$ without regard to order. Pascal's identity: $\binom{n}{r}=\binom{n-1}{r-1}+\binom{n-1}{r}$.

2D Shapes — Area & Perimeter

Shape | Area | Perimeter
Circle (radius $r$) | $A = \pi r^2$ | $C = 2\pi r$
Triangle (base $b$, height $h$) | $A = \tfrac{1}{2}bh$ | $P = a+b+c$
Rectangle ($w \times h$) | $A = wh$ | $P = 2(w+h)$
Square (side $a$) | $A = a^2$ | $P = 4a$
Trapezoid (bases $a,b$, height $h$) | $A = \tfrac{1}{2}(a+b)h$ | $P = a+b+c+d$
Parallelogram (base $b$, height $h$) | $A = bh$ | $P = 2(a+b)$
Ellipse (semi-axes $a,b$) | $A = \pi ab$ | $C \approx 2\pi\sqrt{\tfrac{a^2+b^2}{2}}$
Regular $n$-gon (side $s$) | $A = \tfrac{ns^2}{4}\cot(\pi/n)$ | $P = ns$

Triangle Theorems

Pythagorean Theorem
$$a^2 + b^2 = c^2$$
For a right triangle with legs $a, b$ and hypotenuse $c$. Generalizes to $c^2 = a^2+b^2-2ab\cos C$ (Law of Cosines).
Heron's Formula
$$A = \sqrt{s(s-a)(s-b)(s-c)}, \quad s = \frac{a+b+c}{2}$$
Area of a triangle with sides $a, b, c$ using semi-perimeter $s$. No height required.
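Heron's formula translates directly into code; a minimal sketch checked against the 3–4–5 right triangle:

```python
from math import sqrt

def heron_area(a, b, c):
    """Triangle area from three side lengths (Heron's formula)."""
    s = (a + b + c) / 2                        # semi-perimeter
    return sqrt(s * (s - a) * (s - b) * (s - c))

area = heron_area(3, 4, 5)   # right triangle: legs 3, 4 -> area 6
```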
Triangle Area (SAS)
$$A = \frac{1}{2}ab\sin C$$
Area when two sides $a, b$ and their included angle $C$ are known.
Circumradius & Inradius
$$R = \frac{abc}{4A}, \quad r = \frac{A}{s}$$
$R$ = circumscribed circle radius, $r$ = inscribed circle radius, $A$ = triangle area, $s$ = semi-perimeter.

3D Solids — Volume & Surface Area

Solid | Volume | Surface Area
Sphere (radius $r$) | $V = \tfrac{4}{3}\pi r^3$ | $SA = 4\pi r^2$
Cylinder (radius $r$, height $h$) | $V = \pi r^2 h$ | $SA = 2\pi r(r+h)$
Cone (radius $r$, height $h$) | $V = \tfrac{1}{3}\pi r^2 h$ | $SA = \pi r(r+l)$, $l=\sqrt{r^2+h^2}$
Cube (side $a$) | $V = a^3$ | $SA = 6a^2$
Rectangular box ($l,w,h$) | $V = lwh$ | $SA = 2(lw+wh+lh)$
Pyramid (base area $B$, height $h$) | $V = \tfrac{1}{3}Bh$ | depends on base
Torus (radii $R$, $r$) | $V = 2\pi^2 Rr^2$ | $SA = 4\pi^2 Rr$

Analytic Geometry

Distance & Midpoint
$$d = \sqrt{(x_2-x_1)^2+(y_2-y_1)^2}$$ $$M = \left(\frac{x_1+x_2}{2},\frac{y_1+y_2}{2}\right)$$
Distance between two points and midpoint formula in 2D.
Conic Sections
$$\text{Circle: } (x-h)^2+(y-k)^2 = r^2$$ $$\text{Ellipse: } \frac{(x-h)^2}{a^2}+\frac{(y-k)^2}{b^2}=1$$ $$\text{Parabola: } y = a(x-h)^2+k$$
Standard forms with center/vertex at $(h,k)$.

Fundamental Definitions

SOH-CAH-TOA
$$\sin\theta = \frac{\text{opp}}{\text{hyp}},\quad \cos\theta = \frac{\text{adj}}{\text{hyp}},\quad \tan\theta = \frac{\text{opp}}{\text{adj}}$$
Definitions for right triangles. Reciprocals: $\csc = 1/\sin$, $\sec = 1/\cos$, $\cot = 1/\tan$.
Pythagorean Identities
$$\sin^2\theta + \cos^2\theta = 1$$ $$1 + \tan^2\theta = \sec^2\theta$$ $$1 + \cot^2\theta = \csc^2\theta$$
Derived from the Pythagorean theorem on the unit circle.

Sum & Difference Formulas

Sine Sum / Difference
$$\sin(\alpha \pm \beta) = \sin\alpha\cos\beta \pm \cos\alpha\sin\beta$$
Fundamental expansion used in many derivations.
Cosine Sum / Difference
$$\cos(\alpha \pm \beta) = \cos\alpha\cos\beta \mp \sin\alpha\sin\beta$$
Note the sign reversal: $+$ on left gives $-$ on right.
Tangent Sum / Difference
$$\tan(\alpha \pm \beta) = \frac{\tan\alpha \pm \tan\beta}{1 \mp \tan\alpha\tan\beta}$$
Undefined when denominator equals zero.
Double Angle
$$\sin 2\theta = 2\sin\theta\cos\theta$$ $$\cos 2\theta = \cos^2\theta - \sin^2\theta = 1-2\sin^2\theta$$ $$\tan 2\theta = \frac{2\tan\theta}{1-\tan^2\theta}$$
Special case of sum formulas with $\alpha = \beta = \theta$.

Half Angle & Product-to-Sum

Half Angle Formulas
$$\sin\frac{\theta}{2} = \pm\sqrt{\frac{1-\cos\theta}{2}}$$ $$\cos\frac{\theta}{2} = \pm\sqrt{\frac{1+\cos\theta}{2}}$$ $$\tan\frac{\theta}{2} = \frac{\sin\theta}{1+\cos\theta} = \frac{1-\cos\theta}{\sin\theta}$$
Sign depends on the quadrant of $\theta/2$.
Product-to-Sum
$$\sin\alpha\cos\beta = \tfrac{1}{2}[\sin(\alpha+\beta)+\sin(\alpha-\beta)]$$ $$\cos\alpha\cos\beta = \tfrac{1}{2}[\cos(\alpha-\beta)+\cos(\alpha+\beta)]$$ $$\sin\alpha\sin\beta = \tfrac{1}{2}[\cos(\alpha-\beta)-\cos(\alpha+\beta)]$$
Useful for integration of products of trig functions.
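The identities above can be spot-checked at arbitrary angles; a minimal numerical sketch of the first product-to-sum formula:

```python
from math import sin, cos

# spot-check sin(a)cos(b) = [sin(a+b) + sin(a-b)] / 2 at arbitrary angles
a, b = 0.7, 1.9
lhs = sin(a) * cos(b)
rhs = (sin(a + b) + sin(a - b)) / 2
```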

Laws of Sines & Cosines

Law of Sines
$$\frac{a}{\sin A} = \frac{b}{\sin B} = \frac{c}{\sin C} = 2R$$
Relates sides to opposite angles. $R$ is the circumradius. Used for AAS/ASA triangle solving.
Law of Cosines
$$c^2 = a^2 + b^2 - 2ab\cos C$$
Generalization of the Pythagorean theorem. Used for SAS/SSS. When $C=90°$, reduces to $c^2=a^2+b^2$.

Common Angle Values

$\theta$ | $0°$ | $30°$ | $45°$ | $60°$ | $90°$
$\sin\theta$ | $0$ | $\tfrac{1}{2}$ | $\tfrac{\sqrt{2}}{2}$ | $\tfrac{\sqrt{3}}{2}$ | $1$
$\cos\theta$ | $1$ | $\tfrac{\sqrt{3}}{2}$ | $\tfrac{\sqrt{2}}{2}$ | $\tfrac{1}{2}$ | $0$
$\tan\theta$ | $0$ | $\tfrac{\sqrt{3}}{3}$ | $1$ | $\sqrt{3}$ | undef.

Limits

L'Hôpital's Rule
$$\lim_{x\to a}\frac{f(x)}{g(x)} = \lim_{x\to a}\frac{f'(x)}{g'(x)}$$
Applies when the limit has the indeterminate form $0/0$ or $\infty/\infty$ and the limit of $f'/g'$ exists (or is $\pm\infty$). May be applied repeatedly.
Standard Limits
$$\lim_{x\to 0}\frac{\sin x}{x} = 1,\quad \lim_{x\to\infty}\left(1+\frac{1}{x}\right)^x = e$$ $$\lim_{x\to 0}(1+x)^{1/x} = e$$
Fundamental limits used throughout calculus and analysis.
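Both limits can be approached numerically; a small sketch evaluating each expression at a large (or small) argument:

```python
from math import sin

# sin(x)/x -> 1 as x -> 0
x = 1e-8
sinc_limit = sin(x) / x

# (1 + 1/n)^n -> e as n -> infinity
n = 1_000_000
e_limit = (1 + 1 / n) ** n
```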

Derivative Rules

Basic Rules
$$(c)' = 0,\quad (x^n)' = nx^{n-1}$$ $$(fg)' = f'g + fg' \quad\text{(Product)}$$ $$\left(\frac{f}{g}\right)' = \frac{f'g - fg'}{g^2} \quad\text{(Quotient)}$$
Constant, power, product, and quotient rules.
Chain Rule
$$\frac{d}{dx}f(g(x)) = f'(g(x))\cdot g'(x)$$
For composite functions. Equivalently: $\dfrac{dy}{dx} = \dfrac{dy}{du}\cdot\dfrac{du}{dx}$.
Derivatives of Trig Functions
$$(\sin x)' = \cos x,\quad (\cos x)' = -\sin x$$ $$(\tan x)' = \sec^2 x,\quad (\cot x)' = -\csc^2 x$$ $$(\sec x)' = \sec x\tan x,\quad (\csc x)' = -\csc x\cot x$$
Derivatives of Inverse Trig
$$(\arcsin x)' = \frac{1}{\sqrt{1-x^2}},\quad (\arccos x)' = \frac{-1}{\sqrt{1-x^2}}$$ $$(\arctan x)' = \frac{1}{1+x^2}$$
Domain restrictions apply: the derivatives of $\arcsin$ and $\arccos$ require $|x| < 1$.
Exponential & Log Derivatives
$$(e^x)' = e^x,\quad (a^x)' = a^x \ln a$$ $$(\ln x)' = \frac{1}{x},\quad (\log_a x)' = \frac{1}{x\ln a}$$
$e^x$ is, up to constant multiples, the unique function equal to its own derivative. Requires $a>0$, $a\neq1$.
Implicit Differentiation
$$\frac{d}{dx}[F(x,y)] = 0 \implies \frac{dy}{dx} = -\frac{F_x}{F_y}$$
Differentiate both sides w.r.t. $x$, treating $y$ as a function of $x$, then solve for $dy/dx$.

Fundamental Theorem of Calculus

FTC — Both Parts
$$\textbf{Part I:}\quad \frac{d}{dx}\int_a^x f(t)\,dt = f(x)$$ $$\textbf{Part II:}\quad \int_a^b f(x)\,dx = F(b) - F(a)$$
Where $F$ is any antiderivative of $f$, i.e. $F'=f$. The cornerstone connecting differentiation and integration.

Integration Techniques

Integration by Parts
$$\int u\,dv = uv - \int v\,du$$
Derived from the product rule. Choose $u$ using LIATE: Logarithm, Inverse trig, Algebraic, Trig, Exponential.
u-Substitution
$$\int f(g(x))g'(x)\,dx = \int f(u)\,du,\quad u=g(x)$$
Reverse of the chain rule. Change limits accordingly for definite integrals.
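As a numerical check of the substitution idea: with $u=x^2$, $\int_0^1 2x\cos(x^2)\,dx = \int_0^1 \cos u\,du = \sin 1$. A minimal midpoint-rule sketch confirms this:

```python
from math import cos, sin

# midpoint Riemann sum for the integral of 2x*cos(x^2) over [0, 1]
N = 100_000
h = 1 / N
midpoint_sum = sum(
    2 * ((k + 0.5) * h) * cos(((k + 0.5) * h) ** 2) * h for k in range(N)
)
exact = sin(1)   # value after the substitution u = x^2
```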
Standard Integrals
$$\int x^n dx = \frac{x^{n+1}}{n+1}+C \quad (n\neq -1)$$ $$\int e^x dx = e^x+C, \quad \int \frac{1}{x}dx = \ln|x|+C$$ $$\int \sin x\,dx = -\cos x+C,\quad \int \cos x\,dx = \sin x+C$$
More Integral Formulas
$$\int \frac{1}{x^2+a^2}dx = \frac{1}{a}\arctan\frac{x}{a}+C$$ $$\int \frac{1}{\sqrt{a^2-x^2}}dx = \arcsin\frac{x}{a}+C$$ $$\int \sec^2 x\,dx = \tan x+C$$

Taylor & Maclaurin Series

Taylor Series
$$f(x) = \sum_{n=0}^{\infty}\frac{f^{(n)}(a)}{n!}(x-a)^n$$
Expansion of $f$ around point $a$. Maclaurin series is the special case $a=0$.
Common Maclaurin Series
$$e^x = \sum_{n=0}^{\infty}\frac{x^n}{n!}, \quad \sin x = \sum_{n=0}^{\infty}\frac{(-1)^n x^{2n+1}}{(2n+1)!}$$ $$\cos x = \sum_{n=0}^{\infty}\frac{(-1)^n x^{2n}}{(2n)!}, \quad \frac{1}{1-x} = \sum_{n=0}^{\infty} x^n$$
Valid within radius of convergence.
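Truncating one of these series gives an approximation whose error shrinks rapidly; a minimal sketch for $\sin x$ (the function name is illustrative):

```python
from math import factorial, sin

def maclaurin_sin(x, terms=10):
    """Partial sum of the Maclaurin series for sin x."""
    return sum(
        (-1) ** n * x ** (2 * n + 1) / factorial(2 * n + 1)
        for n in range(terms)
    )

approx = maclaurin_sin(1.0)
err = abs(approx - sin(1.0))   # ten terms already agree to machine precision
```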

Matrix Operations

Matrix Multiplication
$$(AB)_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj}$$
Only defined when $A$ is $m\times n$ and $B$ is $n\times p$. Not commutative: $AB \neq BA$ in general.
Transpose Properties
$$(A^T)^T = A,\quad (AB)^T = B^T A^T$$ $$(A+B)^T = A^T + B^T$$
The transpose of a product reverses the order. Symmetric matrix: $A = A^T$.
Inverse Matrix
$$AA^{-1} = A^{-1}A = I$$ $$A^{-1} = \frac{1}{\det A}\,\text{adj}(A)$$
Exists iff $\det A \neq 0$. $(AB)^{-1} = B^{-1}A^{-1}$. For $2\times 2$: $\begin{pmatrix}a&b\\c&d\end{pmatrix}^{-1}=\frac{1}{ad-bc}\begin{pmatrix}d&-b\\-c&a\end{pmatrix}$.
Determinant (2×2, 3×3)
$$\det\begin{pmatrix}a&b\\c&d\end{pmatrix} = ad-bc$$ $$\det A = a_{11}C_{11}+a_{12}C_{12}+a_{13}C_{13}$$
Cofactor expansion along any row or column. $C_{ij} = (-1)^{i+j}M_{ij}$ where $M_{ij}$ is the minor.
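Cofactor expansion along the first row is short enough to write out by hand for the $3\times 3$ case; a minimal sketch:

```python
def det3(m):
    """3x3 determinant by cofactor expansion along the first row."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    # each term is entry * (its 2x2 minor), with alternating signs
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

D = det3([[2, 0, 1],
          [1, 3, 0],
          [0, 1, 4]])
```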

Eigenvalues & Eigenvectors

Characteristic Equation
$$A\mathbf{v} = \lambda\mathbf{v} \iff \det(A - \lambda I) = 0$$
Eigenvalue $\lambda$ and eigenvector $\mathbf{v}\neq\mathbf{0}$. The characteristic polynomial has degree $n$ for an $n\times n$ matrix.
Trace & Determinant Relations
$$\text{tr}(A) = \sum_{i}\lambda_i,\quad \det(A) = \prod_{i}\lambda_i$$
Sum and product of eigenvalues equal trace and determinant respectively.

Vector Operations

Dot Product
$$\mathbf{a}\cdot\mathbf{b} = |\mathbf{a}||\mathbf{b}|\cos\theta = \sum_{i}a_i b_i$$
Scalar result. $\mathbf{a}\perp\mathbf{b}$ iff $\mathbf{a}\cdot\mathbf{b}=0$. Used to find angles and projections.
Cross Product
$$\mathbf{a}\times\mathbf{b} = \begin{vmatrix}\mathbf{i}&\mathbf{j}&\mathbf{k}\\a_1&a_2&a_3\\b_1&b_2&b_3\end{vmatrix}$$ $$|\mathbf{a}\times\mathbf{b}| = |\mathbf{a}||\mathbf{b}|\sin\theta$$
Vector result perpendicular to both $\mathbf{a}$ and $\mathbf{b}$. Magnitude equals parallelogram area.
Vector Projection
$$\text{proj}_{\mathbf{b}}\mathbf{a} = \frac{\mathbf{a}\cdot\mathbf{b}}{|\mathbf{b}|^2}\,\mathbf{b}$$
Orthogonal projection of $\mathbf{a}$ onto $\mathbf{b}$.
Gram–Schmidt Process
$$\mathbf{u}_k = \mathbf{v}_k - \sum_{j=1}^{k-1}\frac{\mathbf{v}_k\cdot\mathbf{u}_j}{|\mathbf{u}_j|^2}\mathbf{u}_j$$
Orthogonalizes a set of linearly independent vectors. Normalize $\mathbf{u}_k/|\mathbf{u}_k|$ for orthonormal basis.
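The update formula above translates directly into code; a minimal classical Gram–Schmidt sketch on plain Python lists (no external libraries):

```python
def gram_schmidt(vectors):
    """Orthogonalize linearly independent vectors (lists of floats)."""
    ortho = []
    for v in vectors:
        u = list(v)
        for q in ortho:
            # subtract the projection of v onto each earlier u_j
            coef = sum(a * b for a, b in zip(v, q)) / sum(b * b for b in q)
            u = [ui - coef * qi for ui, qi in zip(u, q)]
        ortho.append(u)
    return ortho

u1, u2 = gram_schmidt([[1.0, 1.0], [1.0, 0.0]])
dot = sum(a * b for a, b in zip(u1, u2))   # should be ~0
```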

Descriptive Statistics

Mean, Variance & Std Dev
$$\mu = \frac{1}{n}\sum_{i=1}^n x_i, \quad \sigma^2 = \frac{1}{n}\sum_{i=1}^n(x_i-\mu)^2$$ $$s^2 = \frac{1}{n-1}\sum_{i=1}^n(x_i-\bar{x})^2$$
$\sigma^2$ = population variance, $s^2$ = sample variance (Bessel's correction: $n-1$). $\sigma = \sqrt{\sigma^2}$.
Covariance & Correlation
$$\text{Cov}(X,Y) = \frac{1}{n}\sum(x_i-\bar{x})(y_i-\bar{y})$$ $$r = \frac{\text{Cov}(X,Y)}{\sigma_X\sigma_Y},\quad r\in[-1,1]$$
Pearson correlation $r$ measures linear association strength and direction.

Probability

Bayes' Theorem
$$P(A|B) = \frac{P(B|A)\,P(A)}{P(B)}$$
Relates conditional probabilities. $P(B) = \sum_i P(B|A_i)P(A_i)$ by total probability theorem.
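A standard worked example: a screening test with 99% sensitivity and 95% specificity for a condition with 1% prevalence. The numbers below are illustrative, not from the source:

```python
# prior and test characteristics (illustrative values)
p_disease = 0.01
p_pos_given_disease = 0.99   # sensitivity
p_pos_given_healthy = 0.05   # false-positive rate (1 - specificity)

# total probability of a positive test
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(disease | positive test)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
```

Despite the accurate test, the posterior is only about 1/6, because the condition is rare.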
Conditional & Independence
$$P(A|B) = \frac{P(A\cap B)}{P(B)}$$ $$A\perp B \iff P(A\cap B) = P(A)P(B)$$
Two events are independent iff their joint probability equals the product of marginal probabilities.
Expectation & Variance
$$E[X] = \sum_x x\,P(X=x)\ \text{(discrete)}$$ $$\text{Var}(X) = E[X^2] - (E[X])^2$$ $$\text{Var}(aX+b) = a^2\text{Var}(X)$$
Linearity: $E[aX+b]=aE[X]+b$. For independent $X,Y$: $\text{Var}(X+Y)=\text{Var}(X)+\text{Var}(Y)$.
Central Limit Theorem
$$\bar{X} \xrightarrow{d} N\!\left(\mu,\frac{\sigma^2}{n}\right) \text{ as } n\to\infty$$
The distribution of the sample mean of i.i.d. random variables with finite variance approaches a normal distribution, regardless of the original distribution. Foundation of inferential statistics.

Key Distributions

Distribution | PMF / PDF | Mean | Variance
Binomial $B(n,p)$ | $\binom{n}{k}p^k(1-p)^{n-k}$ | $np$ | $np(1-p)$
Poisson $\text{Po}(\lambda)$ | $\dfrac{e^{-\lambda}\lambda^k}{k!}$ | $\lambda$ | $\lambda$
Normal $N(\mu,\sigma^2)$ | $\dfrac{1}{\sigma\sqrt{2\pi}}e^{-\frac{(x-\mu)^2}{2\sigma^2}}$ | $\mu$ | $\sigma^2$
Exponential $\text{Exp}(\lambda)$ | $\lambda e^{-\lambda x}$, $x\geq 0$ | $1/\lambda$ | $1/\lambda^2$
Uniform $U(a,b)$ | $\dfrac{1}{b-a}$, $a\leq x\leq b$ | $\dfrac{a+b}{2}$ | $\dfrac{(b-a)^2}{12}$

Hypothesis Testing & Confidence Intervals

z-score & t-statistic
$$z = \frac{\bar{X}-\mu_0}{\sigma/\sqrt{n}}, \quad t = \frac{\bar{X}-\mu_0}{s/\sqrt{n}}$$
Use $z$ when $\sigma$ known, $t$ when $\sigma$ unknown (with $n-1$ degrees of freedom).
Confidence Interval
$$\bar{X} \pm z_{\alpha/2}\frac{\sigma}{\sqrt{n}}$$
For a 95% CI: $z_{0.025} = 1.96$. The interval tightens as $n$ increases.

Summation Formulas

Finite Sums
$$\sum_{k=1}^n k = \frac{n(n+1)}{2}$$ $$\sum_{k=1}^n k^2 = \frac{n(n+1)(2n+1)}{6}$$ $$\sum_{k=1}^n k^3 = \left(\frac{n(n+1)}{2}\right)^2$$
Gauss's formula and higher power sums. Note $\sum k^3 = (\sum k)^2$.
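The closed forms are easy to confirm against direct summation; a minimal sketch for $n=100$:

```python
n = 100
direct = sum(range(1, n + 1))          # brute-force sum 1 + 2 + ... + n
closed = n * (n + 1) // 2              # Gauss's formula
cubes = sum(k ** 3 for k in range(1, n + 1))
```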
Geometric Series
$$\sum_{k=0}^{n-1}ar^k = a\frac{1-r^n}{1-r},\quad \sum_{k=0}^{\infty}ar^k = \frac{a}{1-r}\ (|r|<1)$$
Infinite geometric series converges iff $|r| < 1$.

Convergence Tests

Test | Condition for Convergence
Ratio Test | $\displaystyle L = \lim_{n\to\infty}\left|\frac{a_{n+1}}{a_n}\right| < 1$
Root Test | $\displaystyle L = \lim_{n\to\infty}\sqrt[n]{|a_n|} < 1$
Integral Test | $\sum a_n$ and $\int_1^\infty f(x)\,dx$ converge or diverge together
p-series | $\sum \frac{1}{n^p}$ converges iff $p > 1$
Alternating Series | $|a_n|\searrow 0$ and terms alternate in sign (Leibniz)

Power Series & Famous Series

Power Series Radius
$$\sum_{n=0}^{\infty}c_n(x-a)^n, \quad R = \frac{1}{\limsup_{n\to\infty}\sqrt[n]{|c_n|}}$$
Converges for $|x-a| < R$ (Hadamard formula). Diverges for $|x-a| > R$.
Famous Sums
$$\sum_{n=1}^{\infty}\frac{1}{n^2} = \frac{\pi^2}{6} \quad\text{(Basel problem)}$$ $$\sum_{n=0}^{\infty}\frac{(-1)^n}{2n+1} = \frac{\pi}{4} \quad\text{(Leibniz)}$$ $$\sum_{n=1}^{\infty}\frac{1}{n^4} = \frac{\pi^4}{90}$$
Euler's solution to the Basel problem and other remarkable closed forms.

Definitions & Forms

Complex Number
$$z = a + bi,\quad i^2 = -1$$ $$|z| = \sqrt{a^2+b^2},\quad \bar{z} = a-bi$$
$a = \text{Re}(z)$, $b = \text{Im}(z)$. $z\bar{z} = |z|^2$. Modulus $|z|$ is the distance from origin.
Polar Form
$$z = r(\cos\theta + i\sin\theta) = re^{i\theta}$$ $$r = |z|,\quad \theta = \arg(z) = \arctan\!\left(\frac{b}{a}\right)$$
Multiplication in polar form: $z_1 z_2 = r_1 r_2\, e^{i(\theta_1+\theta_2)}$. Note that $\arctan(b/a)$ gives $\arg(z)$ only for $a>0$; adjust by quadrant (or use $\operatorname{atan2}(b,a)$) otherwise.

Fundamental Theorems

Euler's Formula
$$e^{i\theta} = \cos\theta + i\sin\theta$$
One of the most beautiful equations in mathematics. Setting $\theta=\pi$: $e^{i\pi}+1=0$ (Euler's identity).
De Moivre's Theorem
$$(\cos\theta + i\sin\theta)^n = \cos(n\theta) + i\sin(n\theta)$$
Used to find powers and $n$-th roots of complex numbers. Roots: $z_k = r^{1/n}e^{i(\theta+2k\pi)/n}$, $k=0,\ldots,n-1$.
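The root formula maps directly onto Python's complex arithmetic; a minimal sketch computing all cube roots of 8 (function name is illustrative):

```python
import cmath

def nth_roots(z, n):
    """All n-th roots of a complex number via De Moivre's theorem."""
    r, theta = cmath.polar(z)
    return [r ** (1 / n) * cmath.exp(1j * (theta + 2 * cmath.pi * k) / n)
            for k in range(n)]

roots = nth_roots(8, 3)                            # cube roots of 8
real_root = min(roots, key=lambda w: abs(w - 2))   # the real root, 2
check = all(abs(w ** 3 - 8) < 1e-9 for w in roots)
```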
Trig from Euler
$$\cos\theta = \frac{e^{i\theta}+e^{-i\theta}}{2},\quad \sin\theta = \frac{e^{i\theta}-e^{-i\theta}}{2i}$$
Express trigonometric functions using complex exponentials.
Hyperbolic Functions
$$\cosh x = \frac{e^x+e^{-x}}{2},\quad \sinh x = \frac{e^x-e^{-x}}{2}$$ $$\cosh^2 x - \sinh^2 x = 1$$
Analog of trig functions. Connection: $\cos(ix) = \cosh x$, $\sin(ix) = i\sinh x$.

First-Order ODEs

Separable Equations
$$\frac{dy}{dx} = f(x)g(y) \implies \int\frac{dy}{g(y)} = \int f(x)\,dx$$
Separate variables $x$ and $y$ to opposite sides, then integrate both.
Linear First-Order ODE
$$y'+P(x)y = Q(x)$$ $$\text{Integrating factor: }\mu = e^{\int P\,dx}$$ $$y = \frac{1}{\mu}\left(\int \mu Q\,dx + C\right)$$
Multiply through by integrating factor $\mu$ to make left side an exact derivative.

Second-Order Linear ODEs

Homogeneous with Const. Coefficients
$$ay''+by'+cy=0 \implies \text{char. eq. } ar^2+br+c=0$$
Two distinct real roots $r_1,r_2$: $y = C_1 e^{r_1 x}+C_2 e^{r_2 x}$
Repeated root $r$: $y = (C_1+C_2 x)e^{rx}$
Complex roots $\alpha\pm\beta i$: $y = e^{\alpha x}(C_1\cos\beta x+C_2\sin\beta x)$
Variation of Parameters
$$y_p = y_1\int\frac{-y_2 g}{W}dx + y_2\int\frac{y_1 g}{W}dx$$ $$W = \begin{vmatrix}y_1&y_2\\y_1'&y_2'\end{vmatrix}$$
$W$ is the Wronskian. $y_1, y_2$ are linearly independent solutions to the homogeneous equation.

Laplace Transform

Definition
$$\mathcal{L}\{f(t)\} = F(s) = \int_0^{\infty}e^{-st}f(t)\,dt$$
Transforms an ODE in $t$ into an algebraic equation in $s$. Solve in $s$-domain, then invert.
$f(t)$ | $\mathcal{L}\{f(t)\} = F(s)$
$1$ | $\dfrac{1}{s}$
$t^n$ | $\dfrac{n!}{s^{n+1}}$
$e^{at}$ | $\dfrac{1}{s-a}$
$\sin(at)$ | $\dfrac{a}{s^2+a^2}$
$\cos(at)$ | $\dfrac{s}{s^2+a^2}$
$f'(t)$ | $sF(s) - f(0)$
$f''(t)$ | $s^2F(s)-sf(0)-f'(0)$

Divisibility & GCD

Division Algorithm
$$a = bq + r,\quad 0 \leq r < b$$
For any integer $a$ and positive integer $b$, there exist unique $q$ (quotient) and $r$ (remainder). Foundation for GCD algorithms.
Euclidean Algorithm
$$\gcd(a,b) = \gcd(b,\, a \bmod b)$$ $$\gcd(a,0) = a$$
Repeated application reduces to the base case. Also finds $x,y$ with $ax+by=\gcd(a,b)$ (extended GCD).
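The extended version of the algorithm, which also recovers the Bézout coefficients, fits in a few lines; a minimal recursive sketch:

```python
def extended_gcd(a, b):
    """Return (g, x, y) with g = gcd(a, b) and a*x + b*y = g."""
    if b == 0:
        return a, 1, 0                 # base case: gcd(a, 0) = a = a*1 + 0*0
    g, x, y = extended_gcd(b, a % b)   # reduce via gcd(a, b) = gcd(b, a mod b)
    return g, y, x - (a // b) * y      # back-substitute the coefficients

g, x, y = extended_gcd(240, 46)
```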
Bézout's Identity
$$\gcd(a,b) = d \implies \exists\, x,y\in\mathbb{Z}: ax+by=d$$
Linear Diophantine $ax+by=c$ has integer solutions iff $\gcd(a,b) \mid c$.
Fundamental Theorem of Arithmetic
$$n = p_1^{e_1} p_2^{e_2}\cdots p_k^{e_k}, \quad p_i \text{ prime}$$
Every integer $n > 1$ has a unique prime factorization (up to order). Basis for much of number theory.

Modular Arithmetic & Theorems

Fermat's Little Theorem
$$a^p \equiv a \pmod{p}, \quad \gcd(a,p)=1 \implies a^{p-1}\equiv 1$$
Here $p$ is prime. Underlies probabilistic primality tests and, via Euler's generalization, RSA; it also lets exponents be reduced modulo $p-1$ when working mod $p$.
Euler's Theorem
$$a^{\phi(n)} \equiv 1 \pmod{n},\quad \gcd(a,n)=1$$ $$\phi(n) = n\prod_{p|n}\!\left(1-\frac{1}{p}\right)$$
Euler's totient $\phi(n)$ counts integers $\leq n$ coprime to $n$. Generalizes Fermat's little theorem.
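The product formula gives a direct way to compute $\phi(n)$ by trial-division factorization, and the theorem itself can then be checked with modular exponentiation; a minimal sketch:

```python
def phi(n):
    """Euler's totient via the prime-factor product formula."""
    result, m, p = n, n, 2
    while p * p <= m:
        if m % p == 0:
            while m % p == 0:          # strip this prime factor entirely
                m //= p
            result -= result // p      # multiply result by (1 - 1/p)
        p += 1
    if m > 1:                          # leftover prime factor > sqrt(n)
        result -= result // m
    return result

t = phi(10)                  # 1, 3, 7, 9 are coprime to 10 -> 4
check = pow(3, phi(10), 10)  # Euler's theorem: 3^phi(10) = 1 (mod 10)
```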
Chinese Remainder Theorem
$$x \equiv a_i \pmod{n_i},\ i=1\ldots k$$ $$\text{unique solution mod } N = n_1 n_2\cdots n_k$$
If $n_1,\ldots,n_k$ are pairwise coprime, the system has a unique solution modulo $N$.
Wilson's Theorem
$$p \text{ prime} \iff (p-1)! \equiv -1 \pmod{p}$$
Characterization of prime numbers using factorials. Theoretically elegant but computationally impractical for primality testing.