There are several famous inequalities in the theory of probability. The simplest one is the Markov inequality:

$$\Pr[X \ge a] \le \frac{E[X]}{a}$$

for any non-negative random variable $X$ and real number $a>0$.
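
As a quick sanity check, here is a minimal Python sketch (the exponential distribution and its scale parameter are arbitrary choices for illustration) that compares the empirical tail probability against the Markov bound:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1_000_000)  # non-negative X with E[X] = 2

for a in (1.0, 2.0, 5.0, 10.0):
    empirical = np.mean(x >= a)   # Pr[X >= a], estimated by simulation
    bound = x.mean() / a          # Markov bound E[X]/a
    print(f"a={a:>4}: Pr[X>=a]={empirical:.4f} <= E[X]/a={bound:.4f}")
```

Note that for small $a$ the bound exceeds 1 and is vacuous; Markov's inequality is simple but often loose.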

If we consider the random variable $(X-E[X])^2$ and substitute it into the Markov inequality, we have

$$\Pr[(X-E[X])^2 \ge a] \le \frac{E[(X-E[X])^2]}{a} = \frac{Var[X]}{a}.$$
By substituting $a=k^2\sigma^2$ where $Var[X]=\sigma^2$, we have the Chebyshev inequality

$$\Pr[|X-E[X]| \ge k\sigma] \le \frac{1}{k^2},$$

where it is required that $k>0$ and $\sigma$ is non-zero and finite.
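
In the same spirit, a short simulation (using a standard normal purely as an example distribution) shows that the $1/k^2$ bound holds, though it is usually far from tight:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=1_000_000)
mu, sigma = x.mean(), x.std()

for k in (1.0, 2.0, 3.0):
    empirical = np.mean(np.abs(x - mu) >= k * sigma)  # Pr[|X-E[X]| >= k*sigma]
    print(f"k={k}: Pr[|X-E[X]|>=k*sigma]={empirical:.4f} <= 1/k^2={1/k**2:.4f}")
```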

Two other useful but more complicated probability inequalities concern deviation from the mean. The Chernoff bound says that, for $n$ i.i.d. Bernoulli random variables $X_1,X_2,\cdots,X_n$, each with success probability $p>\frac{1}{2}$, the probability of having more than $n/2$ occurrences among the $n$ of them satisfies

$$\Pr\left[\sum_{i=1}^n X_i > \frac{n}{2}\right] \ge 1 - e^{-2n(p-\frac{1}{2})^2}.$$
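
A minimal sketch (the values $n=100$ and $p=0.6$ are arbitrary choices for illustration) can compare the simulated probability of a majority of occurrences against this lower bound:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, trials = 100, 0.6, 100_000
counts = rng.binomial(n, p, size=trials)     # occurrences per experiment

empirical = np.mean(counts > n / 2)          # Pr[sum X_i > n/2]
bound = 1 - np.exp(-2 * n * (p - 0.5) ** 2)  # Chernoff-type lower bound
print(f"Pr[majority]={empirical:.4f} >= bound={bound:.4f}")
```
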
While the expected number of occurrences among these $n$ Bernoulli random variables is $np$, the probability of deviating from $np$ is bounded by Hoeffding's inequality, which says that for $\epsilon > 0$, the probability of no less than $n(p+\epsilon)$ occurrences satisfies

$$\Pr\left[\sum_{i=1}^n X_i \ge n(p+\epsilon)\right] \le e^{-2\epsilon^2 n},$$
and the probability of no more than $n(p-\epsilon)$ occurrences satisfies

$$\Pr\left[\sum_{i=1}^n X_i \le n(p-\epsilon)\right] \le e^{-2\epsilon^2 n},$$
so the probability of having $k$ occurrences, where $k\in[n(p-\epsilon),n(p+\epsilon)]$, satisfies

$$\Pr\left[n(p-\epsilon) \le \sum_{i=1}^n X_i \le n(p+\epsilon)\right] \ge 1 - 2e^{-2\epsilon^2 n}.$$
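
All three bounds are easy to check by simulation; the sketch below (again with arbitrary $n=100$, $p=0.6$, and $\epsilon=0.1$) estimates the two one-sided tail probabilities and the two-sided complement:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, eps, trials = 100, 0.6, 0.1, 100_000
counts = rng.binomial(n, p, size=trials)

upper = np.mean(counts >= n * (p + eps))   # Pr[sum >= n(p+eps)]
lower = np.mean(counts <= n * (p - eps))   # Pr[sum <= n(p-eps)]
inside = np.mean((counts >= n * (p - eps)) & (counts <= n * (p + eps)))

one_sided = np.exp(-2 * eps**2 * n)        # Hoeffding one-sided bound
print(f"upper tail  {upper:.4f} <= {one_sided:.4f}")
print(f"lower tail  {lower:.4f} <= {one_sided:.4f}")
print(f"within band {inside:.4f} >= {1 - 2 * one_sided:.4f}")
```
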
Hoeffding's inequality can be generalized: for independent $X_1,X_2,\cdots,X_n$ with $X_i\in[a_i,b_i]$ a.s., define the empirical mean and its expectation

$$\bar{X} = \frac{1}{n}\sum_{i=1}^n X_i, \qquad E[\bar{X}] = \frac{1}{n}\sum_{i=1}^n E[X_i];$$

then for $t>0$ we have

$$\Pr\left[\bar{X} - E[\bar{X}] \ge t\right] \le \exp\left(-\frac{2n^2t^2}{\sum_{i=1}^n (b_i-a_i)^2}\right).$$
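
As a final illustration, a sketch with independent uniform variables on differing intervals $[a_i,b_i]$ (the intervals and $t$ below are arbitrary choices) compares the empirical deviation probability of the mean with this generalized bound:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials, t = 50, 100_000, 0.1
a = rng.uniform(-1.0, 0.0, size=n)       # fixed endpoints with a_i < b_i
b = a + rng.uniform(0.5, 1.5, size=n)

x = rng.uniform(a, b, size=(trials, n))  # X_i ~ Uniform[a_i, b_i], independent
xbar = x.mean(axis=1)                    # empirical mean per experiment
exbar = ((a + b) / 2).mean()             # E[X-bar] for uniform variables

empirical = np.mean(xbar - exbar >= t)   # Pr[X-bar - E[X-bar] >= t]
bound = np.exp(-2 * n**2 * t**2 / np.sum((b - a) ** 2))
print(f"Pr[deviation >= t]={empirical:.5f} <= bound={bound:.5f}")
```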