(Probability theory:)
The Chebyshev inequality bounds the probability that a random variable deviates from its mean by a given amount, in terms of its variance (or standard deviation).

Lemma. Suppose X is a random variable with finite variance σ² = Var(X) < ∞ (σ is the standard deviation of X) and expectation (mean) μ = EX. Then for all t > 0,

P(|X-μ| ≥ t) ≤ σ²/t²
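
In particular, taking t = kσ for any k > 0 gives P(|X-μ| ≥ kσ) ≤ 1/k²: a random variable with finite variance lies two or more standard deviations from its mean with probability at most 1/4, and three or more with probability at most 1/9.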

The proof is a clever application of Markov's inequality (P(Y ≥ a) ≤ EY/a for any nonnegative random variable Y and any a > 0) to a specially constructed random variable:

Proof. Let Y = (X-μ)² ≥ 0 be another random variable. Then Y has expectation EY = E(X-μ)² = Var(X) = σ². This means we can apply the Markov inequality to Y:

P(Y ≥ s²EY) = P(Y ≥ (sσ)²) ≤ 1/s²

But Y ≥ (sσ)² iff |X-μ| ≥ sσ, so taking s = t/σ (if σ = 0, then X = μ almost surely and the inequality holds trivially) yields Chebyshev's inequality.
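
To see the bound in action, here is a minimal Python sketch, assuming for concreteness that X is standard normal (so μ = 0, σ = 1, and the bound σ²/t² is just 1/t²); any distribution with finite variance would do:

import random

# Draw a large sample from a standard normal distribution (mu = 0, sigma = 1).
n = 100_000
samples = [random.gauss(0.0, 1.0) for _ in range(n)]

# Compare the empirical tail frequency P(|X - mu| >= t) with the bound 1/t^2.
for t in (1.0, 2.0, 3.0):
    tail = sum(1 for x in samples if abs(x) >= t) / n
    print(f"t = {t}: empirical {tail:.4f} <= bound {1.0 / t**2:.4f}")

The true normal tail probabilities (about 0.32, 0.046, and 0.0027) sit well below the bounds (1, 0.25, and about 0.11): Chebyshev's inequality assumes nothing beyond finite variance, but for that same reason it is often far from tight.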
