(Probability theory:)

The Chebyshev inequality gives us a bound on the probability that a random variable deviates from its mean by a given amount, expressed in terms of its standard deviation (or variance).

**Lemma.**
Suppose X is a random variable with finite variance σ^{2}=**Var**(X)<∞ (σ is the standard deviation of X) and expectation (mean) μ=**E**X. Then for all t>0,

**P**(|X-μ|≥t) ≤ σ^{2}/t^{2}
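As a quick sanity check of the bound, we can compare the empirical tail probability against σ²/t² on simulated data. This is an illustrative sketch only; the choice of an Exp(1) distribution (μ = 1, σ² = 1) is an assumption for the example, not part of the statement above.

```python
import random
import statistics

# Empirical check of Chebyshev's bound P(|X - mu| >= t) <= sigma^2 / t^2.
# Distribution choice (Exp(1), so mu = 1 and Var = 1) is arbitrary.
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]
mu = statistics.fmean(samples)
var = statistics.pvariance(samples)

for t in (1.0, 2.0, 3.0):
    empirical = sum(abs(x - mu) >= t for x in samples) / len(samples)
    bound = var / t**2
    print(f"t={t}: P(|X-mu| >= t) ~ {empirical:.4f} <= {bound:.4f}")
```

For heavy deviations the bound is typically loose (for Exp(1) and t = 2 the true tail is about e⁻³ ≈ 0.05, well under σ²/t² = 0.25), which is expected: Chebyshev holds for *every* finite-variance distribution, so it cannot be tight for any particular one.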

The proof is a clever application of the Markov inequality to a specially constructed random variable:

**Proof.**
Let Y=(X-μ)^{2}≥0 be another random variable. Then Y has expectation **E**Y=**E**(X-μ)^{2}=**Var**(X)=σ^{2}. Since Y is nonnegative, we can apply the Markov inequality to Y:

**P**(Y≥s^{2}**E**Y) = **P**(Y≥(sσ)^{2}) ≤ 1/s^{2}

But Y≥(sσ)^{2} iff |X-μ|≥sσ, so taking s=t/σ gives **P**(|X-μ|≥t) ≤ σ^{2}/t^{2}, which is Chebyshev's inequality.
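The intermediate Markov step can also be checked numerically: with Y = (X-μ)², the frequency of Y ≥ s²**E**Y should stay below 1/s². A minimal sketch, assuming a Uniform(0,1) sample purely for illustration (so **E**Y = Var(X) = 1/12):

```python
import random
import statistics

# Numerical sketch of the Markov step in the proof: for Y = (X - mu)^2,
# check that P(Y >= s^2 * E[Y]) <= 1/s^2.  The Uniform(0,1) distribution
# here is an arbitrary choice for the example.
random.seed(1)
xs = [random.uniform(0.0, 1.0) for _ in range(100_000)]
mu = statistics.fmean(xs)
ys = [(x - mu) ** 2 for x in xs]   # Y = (X - mu)^2 >= 0
ey = statistics.fmean(ys)          # E[Y] = Var(X), ~ 1/12 for Uniform(0,1)

for s in (1.5, 2.0, 3.0):
    lhs = sum(y >= s**2 * ey for y in ys) / len(ys)
    print(f"s={s}: P(Y >= s^2 E[Y]) ~ {lhs:.4f} <= {1/s**2:.4f}")
```

Note that for s ≥ √3 the event Y ≥ s²/12 means |X - 1/2| ≥ s/√12 ≥ 1/2, which a Uniform(0,1) variable can never satisfy, so the empirical frequency drops to zero while the Markov bound 1/s² stays positive.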