6.2.2 Markov and Chebyshev Inequalities
Let $X$ be any positive continuous random variable. We can write
\begin{align}
EX &= \int_{-\infty}^{\infty} x f_X(x) \, dx\\
&= \int_{0}^{\infty} x f_X(x) \, dx & \textrm{(since $X$ is positive-valued)}\\
&\geq \int_{a}^{\infty} x f_X(x) \, dx & \textrm{(for any $a>0$)}\\
&\geq \int_{a}^{\infty} a f_X(x) \, dx & \textrm{(since $x \geq a$ in the region of integration)}\\
&= a \int_{a}^{\infty} f_X(x) \, dx\\
&= a P(X \geq a).
\end{align}
Thus, we conclude \begin{align}%\label{} P(X \geq a) \leq \frac{EX}{a}, &\qquad \textrm{for any $a>0$}. \end{align} We can prove the above inequality for discrete or mixed random variables similarly (using the generalized PDF), so we have the following result, called Markov's inequality.
Markov's Inequality
If $X$ is any nonnegative random variable, then
\begin{align}
P(X \geq a) \leq \frac{EX}{a}, \qquad \textrm{for any $a>0$}.
\end{align}
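As a quick numerical sanity check (an illustration added here, not part of the text), the following minimal Python sketch estimates $P(X \geq a)$ by simulation and compares it with the Markov bound $\frac{EX}{a}$; the choice of an exponential distribution with mean $2$ and the values of $a$ are arbitrary assumptions made only for illustration.

```python
import random

# Monte Carlo check of Markov's inequality, P(X >= a) <= EX / a, for a
# nonnegative random variable; Exponential with mean 2 is an arbitrary choice.
random.seed(0)
n_samples = 100_000
mean = 2.0
samples = [random.expovariate(1.0 / mean) for _ in range(n_samples)]

for a in [1.0, 2.0, 4.0, 8.0]:
    tail = sum(x >= a for x in samples) / n_samples  # estimate of P(X >= a)
    bound = mean / a                                 # Markov bound EX / a
    print(f"a = {a:4.1f}   P(X >= a) ~ {tail:.4f}   bound EX/a = {bound:.4f}")
```

For every $a$, the estimated tail probability stays below the bound, although the bound can be quite loose; for $a \leq EX$ it exceeds $1$ and is trivially true.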
Example
Prove the union bound using Markov's inequality.
- Solution
- Similar to the discussion in the previous section, let $A_1, A_2, ..., A_n$ be any events and let $X$ be the number of the events $A_i$ that occur. We saw that \begin{align} EX=P(A_1)+P(A_2)+...+P(A_n)=\sum_{i=1}^n P(A_i). \end{align} Since $X$ is a nonnegative random variable, we can apply Markov's inequality. Choosing $a=1$, we have \begin{align}%\label{} P(X \geq 1) \leq {EX}= \sum_{i=1}^n P(A_i). \end{align} But note that $P(X \geq 1)= P\biggl(\bigcup_{i=1}^n A_i\biggr)$, since $X \geq 1$ exactly when at least one of the events occurs. Thus, we conclude \begin{align} P\biggl(\bigcup_{i=1}^n A_i\biggr) \leq \sum_{i=1}^n P(A_i), \end{align} which is the union bound.
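To make this argument concrete, here is a small simulation sketch (an illustration added here, not from the text); the particular events, defined through a single uniform random variable so that they are strongly dependent, are an arbitrary assumption.

```python
import random

# X counts how many of the events A_1, ..., A_n occur, so EX = sum of P(A_i)
# and P(union of A_i) = P(X >= 1) <= EX by Markov's inequality with a = 1.
# The events A_i = {U <= p_i}, for one uniform U, are an arbitrary (and highly
# dependent) choice; the union bound must hold regardless of dependence.
random.seed(0)
probs = [0.1, 0.2, 0.15]            # P(A_1), P(A_2), P(A_3)
n_trials = 200_000

union_count = 0
for _ in range(n_trials):
    u = random.random()
    x = sum(u <= p for p in probs)  # number of events that occur in this trial
    union_count += (x >= 1)

print("P(union) ~", union_count / n_trials)  # here equals max(probs) = 0.2
print("sum P(A_i) =", sum(probs))            # union bound = 0.45
```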
Example
Let $X \sim Binomial(n,p)$. Using Markov's inequality, find an upper bound on $P(X \geq \alpha n)$, where $p< \alpha<1$. Evaluate the bound for $p=\frac{1}{2}$ and $\alpha=\frac{3}{4}$.
- Solution
- Note that $X$ is a nonnegative random variable and $EX=np$. Applying Markov's inequality, we obtain \begin{align}%\label{} P(X \geq \alpha n) &\leq \frac{EX}{\alpha n}=\frac{pn}{\alpha n}=\frac{p}{\alpha}. \end{align} For $p=\frac{1}{2}$ and $\alpha=\frac{3}{4}$, we obtain \begin{align}%\label{} P(X \geq \frac{3n}{4})\leq \frac{2}{3}. \end{align}
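For a sense of how loose this bound is, the following sketch (an added illustration; the value $n=20$ is an arbitrary assumption) computes the exact binomial tail and compares it with the Markov bound $\frac{p}{\alpha}=\frac{2}{3}$.

```python
from math import comb

# Exact P(X >= 3n/4) for X ~ Binomial(n, 1/2), versus the Markov bound p/alpha.
n, p, alpha = 20, 0.5, 0.75       # n = 20 is an arbitrary choice
threshold = int(alpha * n)        # = 15

tail = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(threshold, n + 1))
print(f"exact P(X >= {threshold}) = {tail:.5f}")   # about 0.0207
print(f"Markov bound         = {p / alpha:.5f}")   # 0.66667
```

The exact tail (about $0.02$ for $n=20$) is far below the Markov bound of $\frac{2}{3}$, which does not even depend on $n$.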
Chebyshev's Inequality:
Let $X$ be any random variable. If you define $Y=(X-EX)^2$, then $Y$ is a nonnegative random variable, so we can apply Markov's inequality to $Y$. In particular, for any positive real number $b$, we have \begin{align}%\label{} P(Y \geq b^2) \leq \frac{EY}{b^2}. \end{align} But note that \begin{align}%\label{} &EY=E(X-EX)^2=Var(X),\\ &P(Y \geq b^2)=P\big((X-EX)^2 \geq b^2\big)=P\big(|X-EX|\geq b\big). \end{align} Thus, we conclude that \begin{align}%\label{} P\big(|X-EX|\geq b\big) \leq \frac{Var(X)}{b^2}. \end{align} This is Chebyshev's inequality.
Chebyshev's Inequality
If $X$ is any random variable, then for any $b>0$ we have \begin{align}%\label{} P\big(|X-EX|\geq b\big) \leq \frac{Var(X)}{b^2}. \end{align} Chebyshev's inequality states that the probability of a large deviation of $X$ from $EX$ is controlled by $Var(X)$. This is intuitively expected, as the variance measures the average squared distance of $X$ from its mean.
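The following short sketch (an added illustration, not from the text) checks Chebyshev's inequality numerically, which amounts to applying Markov's inequality to $Y=(X-EX)^2$; the choice of a $Uniform(0,1)$ random variable and the values of $b$ are arbitrary assumptions.

```python
import random

# Monte Carlo check of Chebyshev's inequality, P(|X - EX| >= b) <= Var(X)/b^2,
# for X ~ Uniform(0, 1): EX = 1/2 and Var(X) = 1/12 (arbitrary example choice).
random.seed(0)
n_samples = 200_000
samples = [random.random() for _ in range(n_samples)]
mean, var = 0.5, 1.0 / 12.0

for b in [0.2, 0.3, 0.4]:
    tail = sum(abs(x - mean) >= b for x in samples) / n_samples
    print(f"b = {b}   P(|X-EX| >= b) ~ {tail:.4f}   Var/b^2 = {var / b**2:.4f}")
```

A convenient special case is $b=k\sigma_X$: Chebyshev's inequality then gives $P\big(|X-EX|\geq k\sigma_X\big) \leq \frac{1}{k^2}$, i.e., $X$ is within $k$ standard deviations of its mean with probability at least $1-\frac{1}{k^2}$.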
Example
Let $X \sim Binomial(n,p)$. Using Chebyshev's inequality, find an upper bound on $P(X \geq \alpha n)$, where $p< \alpha<1$. Evaluate the bound for $p=\frac{1}{2}$ and $\alpha=\frac{3}{4}$.
- Solution
- One way to obtain a bound is to write \begin{align} P(X \geq \alpha n) &= P(X-np \geq \alpha n-np)\\ &\leq P\big(|X-np|\geq n\alpha-np\big)\\ &\leq \frac{Var(X)}{(n\alpha-np)^2}\\ &=\frac{p(1-p)}{n(\alpha-p)^2}. \end{align}
For $p=\frac{1}{2}$ and $\alpha=\frac{3}{4}$, we obtain \begin{align}%\label{} P(X \geq \frac{3n}{4})\leq \frac{4}{n}. \end{align} Note that this bound goes to zero as $n$ grows, so for large $n$ it is much tighter than the constant bound $\frac{2}{3}$ obtained from Markov's inequality.
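To compare the two bounds numerically, the sketch below (an added illustration; the listed values of $n$ are arbitrary assumptions) prints the exact tail together with the Markov bound $\frac{2}{3}$ and the Chebyshev bound $\frac{4}{n}$.

```python
from math import comb

# Exact P(X >= 3n/4) for X ~ Binomial(n, 1/2) versus the Markov bound 2/3
# and the Chebyshev bound 4/n; the values of n are arbitrary choices.
for n in [20, 40, 100]:
    threshold = 3 * n // 4                    # 3n/4 (an integer for these n)
    tail = sum(comb(n, k) * 0.5**n for k in range(threshold, n + 1))
    print(f"n = {n:3d}   exact = {tail:.6f}   "
          f"Markov = {2/3:.4f}   Chebyshev = {4/n:.4f}")
```

Even though the Chebyshev bound is still loose, it decreases like $\frac{1}{n}$ while the Markov bound stays fixed at $\frac{2}{3}$.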