# Inequalities

### Markov’s Inequality

Let X be a non-negative random variable and suppose that E(X) exists. Then, for any t > 0,

P(X > t) ≤ E(X) / t.
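A quick Monte Carlo check of Markov's bound P(X > t) ≤ E(X)/t, using an Exponential(1) variable as an arbitrary non-negative example (a minimal sketch; the distribution and thresholds are my choices, not from the text):

```python
import random

random.seed(0)
# X ~ Exponential(1) is non-negative with E(X) = 1.
samples = [random.expovariate(1.0) for _ in range(100_000)]
ex = sum(samples) / len(samples)  # empirical E(X)

for t in (1.0, 2.0, 5.0):
    tail = sum(x > t for x in samples) / len(samples)  # empirical P(X > t)
    assert tail <= ex / t  # Markov's bound holds
```

The bound is loose here (for Exponential(1), P(X > t) = e^{−t}, far below 1/t), which is typical: Markov's inequality trades tightness for generality.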

### Chebyshev’s Inequality

Let µ = E(X) and σ^{2} = V(X). Then,

P(|X − µ| ≥ t) ≤ σ^{2} / t^{2}  and  P(|Z| ≥ k) ≤ 1 / k^{2},

where Z = (X − µ) / σ. In particular, P(|Z| > 2) ≤ 1/4 and P(|Z| > 3) ≤ 1/9.
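An empirical check of the standardized form P(|Z| > k) ≤ 1/k^{2}, again with Exponential(1) samples as an arbitrary choice (a minimal sketch, not from the text):

```python
import random

random.seed(0)
n = 100_000
# X ~ Exponential(1): mean 1, variance 1.
samples = [random.expovariate(1.0) for _ in range(n)]
mu = sum(samples) / n
var = sum((x - mu) ** 2 for x in samples) / n
sigma = var ** 0.5

for k in (2, 3):
    # empirical P(|Z| > k), where Z = (X - mu) / sigma
    frac = sum(abs((x - mu) / sigma) > k for x in samples) / n
    assert frac <= 1 / k**2  # Chebyshev's bound holds
```

Chebyshev requires only a finite variance; for a specific distribution the true tail probability is usually much smaller than 1/k^{2}.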

### Hoeffding’s Inequality

**Theorem 1**:

Let Y1, . . . , Yn be independent observations such that E(Yi) = 0 and ai ≤ Yi ≤ bi. Let ε > 0. Then, for any t > 0,

P(∑_{i=1}^{n} Yi ≥ ε) ≤ e^{−tε} ∏_{i=1}^{n} e^{t^{2}(bi − ai)^{2}/8}.

**Theorem 2**:

Let X1, . . . , Xn ~ Bernoulli(p) and let X̄n = (1/n) ∑_{i=1}^{n} Xi. Then, for any ε > 0,

P(|X̄n − p| > ε) ≤ 2e^{−2nε^{2}}.
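An empirical check of the Bernoulli case: simulate many sample means and compare the deviation frequency against 2e^{−2nε^{2}} (a minimal sketch; the values of p, n, and ε are arbitrary choices):

```python
import math
import random

random.seed(1)
p, n, eps = 0.5, 100, 0.1
trials = 10_000

# Count how often the sample mean deviates from p by more than eps.
deviations = 0
for _ in range(trials):
    xbar = sum(random.random() < p for _ in range(n)) / n
    if abs(xbar - p) > eps:
        deviations += 1

empirical = deviations / trials
bound = 2 * math.exp(-2 * n * eps**2)  # Hoeffding's bound
assert empirical <= bound
```

Unlike Chebyshev, the bound decays exponentially in n, which is what makes it useful for constructing confidence intervals for p.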

### Cauchy-Schwarz Inequality

If X and Y have finite variances, then

E|XY| ≤ √(E(X^{2}) E(Y^{2})).

### Jensen’s Inequality

If g is convex, then **E g(X) ≥ g(EX)**.

If g is concave, then **E g(X) ≤ g(EX)**.
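A numerical illustration of the convex case with g(x) = x^{2}, so that E g(X) = E(X^{2}) ≥ (EX)^{2} = g(EX) (a minimal sketch; the Uniform(0, 1) choice of X is arbitrary):

```python
import random

random.seed(2)
samples = [random.uniform(0, 1) for _ in range(100_000)]

mean_of_g = sum(x**2 for x in samples) / len(samples)  # E g(X) = E(X^2)
g_of_mean = (sum(samples) / len(samples)) ** 2         # g(EX) = (EX)^2
assert mean_of_g >= g_of_mean  # Jensen: E g(X) >= g(EX) for convex g
```

For X ~ Uniform(0, 1), E(X^{2}) = 1/3 while (EX)^{2} = 1/4; the gap E(X^{2}) − (EX)^{2} is exactly the variance, which is why V(X) ≥ 0 is itself an instance of Jensen's inequality.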