6.2.6 Solved Problems
Problem
Your friend tells you that he had four job interviews last week. He says that based on how the interviews went, he thinks he has a $20\%$ chance of receiving an offer from each of the companies he interviewed with. Nevertheless, since he interviewed with four companies, he is $90\%$ sure that he will receive at least one offer. Is he right?
- Solution
-
Let $A_i$ be the event that your friend receives an offer from the $i$th company, $i=1,2,3,4$. Then, by the union bound,
\begin{align}
P\left(\bigcup_{i=1}^4 A_i\right) \leq \sum_{i=1}^{4} P(A_i) = 0.2+0.2+0.2+0.2 = 0.8.
\end{align}
Thus, the probability of receiving at least one offer is less than or equal to $80\%$, so he cannot be $90\%$ sure: his claim is not justified.
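As a quick numerical sanity check, we can compare the union bound with the exact probability that would hold if the four offers were independent (an assumption the union bound itself does not need); the code below is purely illustrative.

```python
# Union bound vs. the exact value under an (illustrative) independence assumption.
p = 0.2   # chance of an offer from each company
n = 4     # number of interviews

union_bound = n * p                      # sum of the individual probabilities
at_least_one_if_indep = 1 - (1 - p)**n   # exact value if the offers were independent

print(union_bound)            # 0.8
print(at_least_one_if_indep)  # ~0.5904
```

Even under independence, the chance of at least one offer is only about $59\%$, well below the claimed $90\%$.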
-
Problem
An isolated edge in a network is an edge that connects two nodes in the network such that neither of the two nodes is connected to any other node in the network. Let $C_n$ be the event that a graph randomly generated according to the $G(n,p)$ model has at least one isolated edge.
- Show that \begin{align}%\label{} P(C_n) \leq {n \choose 2} p(1-p)^{2(n-2)} \end{align}
- Show that, for any constant $b >\frac{1}{2}$, if $p=p_n=b \frac{\ln (n)}{n}$ then \begin{align}%\label{} \lim_{n\rightarrow \infty} P(C_n)=0. \end{align}
- Solution
-
There are ${n \choose 2}$ possible edges in the graph. Let $E_i$ be the event that the $i$th edge is an isolated edge. Then
\begin{equation}
P(E_i)=p(1-p)^{2(n-2)},
\end{equation}
where $p$ is the probability that the $i$th edge is present and $(1-p)^{2(n-2)}$ is the probability that neither of its two endpoints is connected to any of the other $n-2$ nodes. By the union bound, we have
\begin{align}
P(C_n) = P\left(\bigcup_i E_i\right) \leq \sum_i P(E_i) = {n \choose 2} p(1-p)^{2(n-2)},
\end{align}
which is the desired result. Now, let $p=b\frac{\ln n}{n}$, where $b>\frac{1}{2}$. Here, it is convenient to use the following inequality: \begin{equation} 1-x \leq e^{-x}, \quad \text{for all} \quad x \in \mathbb{R}. \end{equation} You can prove it by differentiating $f(x)=e^{-x}+x-1$ and showing that its minimum, $f(0)=0$, occurs at $x=0$.
Now, we can write
\begin{align}
P(C_n) &\leq {n \choose 2}p(1-p)^{2(n-2)}\\
&= \frac{n(n-1)}{2} \cdot \frac{b\ln n}{n} (1-p)^{2(n-2)}\\
&\leq \frac{(n-1)b \ln n}{2}\, e^{-2p(n-2)} \quad (\textrm{using } 1-x \leq e^{-x})\\
&= \frac{(n-1)}{2}\, b \ln n \ e^{-2 \frac{b \ln n}{n}(n-2)}.
\end{align}
Thus,
\begin{align}
\lim_{n\rightarrow \infty}P(C_n) &\leq \lim_{n\rightarrow \infty}\frac{(n-1)}{2}\, b \ln n \ e^{-2 \frac{b \ln n}{n}(n-2)}\\
&= \lim_{n\rightarrow \infty}\frac{(n-1)}{2}\, b \ln n \ n^{-2b} \quad \left(\textrm{since } e^{-2 \frac{b \ln n}{n}(n-2)}=n^{-2b\frac{n-2}{n}} \textrm{ and } \tfrac{n-2}{n} \rightarrow 1\right)\\
&=\frac{b}{2}\lim_{n\rightarrow \infty}\left(n^{1-2b} \ln n\right)\\
&= 0 \quad \left(\textrm{since } b > \frac{1}{2}\right).
\end{align}
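The decay can also be observed numerically; the sketch below evaluates the bound ${n \choose 2} p(1-p)^{2(n-2)}$ for the arbitrary choice $b=0.6>\frac{1}{2}$. Note that for $b$ close to $\frac{1}{2}$ the convergence is quite slow, roughly of order $n^{1-2b}\ln n$.

```python
import math

# Evaluate the upper bound C(n,2) * p * (1-p)^(2(n-2)) with p = b*ln(n)/n.
# b = 0.6 is an arbitrary choice satisfying b > 1/2.
def isolated_edge_bound(n, b=0.6):
    p = b * math.log(n) / n
    return math.comb(n, 2) * p * (1 - p) ** (2 * (n - 2))

vals = [isolated_edge_bound(n) for n in (10, 100, 1000, 10_000)]
print(vals)   # slowly decreasing toward 0
```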
-
Problem
Let $X \sim Exponential(\lambda)$. Using Markov's inequality find an upper bound for $P(X \geq a)$, where $a>0$. Compare the upper bound with the actual value of $P(X \geq a)$.
- Solution
-
If $X \sim Exponential(\lambda)$, then $EX=\frac{1}{\lambda}$. Using Markov's inequality, we obtain \begin{equation} P\left(X \geq a\right) \leq \frac{EX}{a}=\frac{1}{\lambda a}. \end{equation} The actual value of $P(X \geq a)$ is $e^{- \lambda a}$, and we always have $\frac{1}{\lambda a} \geq e^{- \lambda a}$, since $e^{\lambda a} \geq \lambda a$ for all $\lambda a>0$.
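A short numerical comparison (with the arbitrary choice $\lambda=1$) shows how loose Markov's bound is for larger $a$:

```python
import math

# Compare Markov's bound 1/(lam*a) with the exact tail P(X >= a) = e^(-lam*a).
# lam = 1 and the values of a are arbitrary illustrative choices.
lam = 1.0
for a in (1.0, 2.0, 5.0, 10.0):
    markov = 1 / (lam * a)
    exact = math.exp(-lam * a)
    assert exact <= markov        # the bound always dominates the true tail
    print(a, markov, exact)
```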
-
Problem
Let $X \sim Exponential(\lambda)$. Using Chebyshev's inequality find an upper bound for $P(|X-EX| \geq b)$, where $b>0$.
- Solution
-
- We have $EX=\frac{1}{\lambda}$ and $Var(X)=\frac{1}{\lambda^2}$. Using Chebyshev's inequality, we have
\begin{align}
P\left(|X-EX| \geq b\right) \leq \frac{Var(X)}{b^2} = \frac{1}{\lambda^2 b^2}.
\end{align}
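A Monte Carlo spot-check of this bound (with the arbitrary choices $\lambda=2$, $b=1$) might look as follows:

```python
import random

# Monte Carlo check of Chebyshev's bound 1/(lam^2 * b^2) for X ~ Exponential(lam).
# lam = 2 and b = 1 are arbitrary illustrative choices.
random.seed(0)
lam, b = 2.0, 1.0
n = 200_000
mean = 1 / lam
hits = sum(abs(random.expovariate(lam) - mean) >= b for _ in range(n))
empirical = hits / n
bound = 1 / (lam**2 * b**2)
print(empirical, bound)   # empirical frequency vs. the Chebyshev bound 0.25
```

Here the true probability is $e^{-3} \approx 0.05$ (only the event $X \geq 1.5$ contributes, since $X$ is positive), so the bound $0.25$ is quite loose.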
-
Problem
Let $X \sim Exponential(\lambda)$. Using Chernoff bounds find an upper bound for $P(X \geq a)$, where $a>EX$. Compare the upper bound with the actual value of $P(X \geq a)$.
- Solution
-
If $X \sim Exponential(\lambda)$, then \begin{equation} M_X(s)=\frac{\lambda}{\lambda-s}, \quad \text{for} \quad s<\lambda. \end{equation} Using Chernoff bounds, we have \begin{align} P\left(X \geq a\right) &\leq \min_{s>0} \left[e^{-sa}M_X(s)\right]\\ &= \min_{0<s<\lambda} \left[e^{-sa} \frac{\lambda}{\lambda-s}\right]. \end{align} Letting $f(s)=e^{-sa} \frac{\lambda}{\lambda-s}$, we find the minimizer by setting \begin{equation} \frac{d}{ds} f(s)=0, \end{equation} which gives \begin{equation} s^{*}=\lambda-\frac{1}{a}. \end{equation} Note that since $a>EX=\frac{1}{\lambda}$, we have $\lambda -\frac{1}{a}>0$, so $s^*$ is a valid choice. Thus, \begin{equation} P\left(X \geq a\right) \leq e^{-s^{*}a} \frac{\lambda}{\lambda-s^{*}}= a \lambda e^{1-\lambda a}. \end{equation} The actual value of $P\left(X \geq a\right)$ is $e^{-\lambda a}$, and indeed $e^{-\lambda a} \leq a\lambda e^{1-\lambda a}$: this is equivalent to $a \lambda e \geq 1$, which holds since $a> \frac{1}{\lambda}$.
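Numerically (with the arbitrary choice $\lambda=1$), the Chernoff bound can be compared with the exact tail and with Markov's bound:

```python
import math

# Compare the Chernoff bound a*lam*e^(1-lam*a) with the exact tail e^(-lam*a)
# and with Markov's bound 1/(lam*a). lam = 1 is an arbitrary choice.
lam = 1.0
for a in (2.0, 3.0, 5.0):     # must satisfy a > EX = 1/lam
    chernoff = a * lam * math.exp(1 - lam * a)
    markov = 1 / (lam * a)
    exact = math.exp(-lam * a)
    assert exact <= chernoff
    print(a, exact, chernoff, markov)
```

Unlike Markov's bound, the Chernoff bound decays exponentially in $a$, matching the exponential decay of the true tail up to the factor $a\lambda e$.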
-
Problem
Let $X$ and $Y$ be two random variables with $EX=1, Var(X)=4$, and $EY=2, Var(Y)=1$. Find the maximum possible value for $E[XY]$.
- Solution
-
Using $\rho(X,Y) \leq 1$ and $\rho(X,Y)=\frac{Cov(X,Y)}{\sigma_X\sigma_Y}$, we conclude \begin{equation} \frac{EXY-EX \cdot EY}{\sigma_X\sigma_Y} \leq 1. \end{equation} Thus,
\begin{align}
EXY \leq \sigma_X \sigma_Y + EX \cdot EY = 2 \times 1 + 1 \times 2 = 4.
\end{align}
In fact, we can achieve $EXY=4$ if we choose $Y=aX+b$: \begin{equation} Y=aX+b \quad \Rightarrow \quad \left \{\begin{array}{ll} 2=a+b & (EY=aEX+b)\\ 1=4a^2 & (Var(Y)=a^2 Var(X)) \end{array}\right. \end{equation} Solving for $a$ and $b$, we obtain \begin{align} a=\frac{1}{2}, \quad b=\frac{3}{2}. \end{align} Note that if you use the Cauchy-Schwarz inequality directly, you obtain
\begin{align}
|EXY|^2 \leq EX^2 \cdot EY^2 = 5 \times 5,
\end{align}
since $EX^2=Var(X)+(EX)^2=4+1=5$ and $EY^2=Var(Y)+(EY)^2=1+4=5$. Thus \begin{equation} EXY \leq 5. \end{equation} But $EXY=5$ cannot be achieved, because equality in the Cauchy-Schwarz inequality is obtained only when $Y=\alpha X$, which is impossible here: $EY=2=\alpha EX$ forces $\alpha=2$, but then $Var(Y)=4\,Var(X)=16 \neq 1$.
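We can verify by simulation that $Y=\frac{1}{2}X+\frac{3}{2}$ attains $EXY=4$; since the problem fixes only the moments, we take $X \sim N(1,4)$ as one convenient distribution with $EX=1$ and $Var(X)=4$.

```python
import random

# Monte Carlo check that Y = X/2 + 3/2 attains E[XY] = 4 when EX = 1, Var(X) = 4.
# X ~ Normal(1, 2) is one convenient distribution with these moments.
random.seed(0)
n = 400_000
total_xy = 0.0
for _ in range(n):
    x = random.gauss(1.0, 2.0)   # mean 1, standard deviation 2
    y = 0.5 * x + 1.5            # then EY = 2 and Var(Y) = 1
    total_xy += x * y
print(total_xy / n)   # close to 4
```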
-
Problem
(Hölder's Inequality) Prove \begin{align}%\label{} E\left[|XY|\right] \leq E\big[|X|^p\big]^{\frac{1}{p}} E\big[|Y|^q\big]^{\frac{1}{q}}, \end{align} where $1<p,q<\infty$ and $\frac{1}{p}+\frac{1}{q}=1$. Note that, for $p=q=2$, Hölder's inequality becomes the Cauchy-Schwarz inequality. Hint: You can use Young's inequality [4], which states that for nonnegative real numbers $\alpha$ and $\beta$ and real numbers $p$ and $q$ such that $1<p,q<\infty$ and $\frac{1}{p}+\frac{1}{q}=1$, we have \begin{align}%\label{} \alpha \beta \leq \frac{\alpha^p}{p}+\frac{\beta^q}{q}, \end{align} with equality only if $\alpha^p=\beta^q$.
- Solution
-
Using Young's inequality, we conclude that for random variables $U$ and $V$ we have \begin{equation} E|UV| \leq \frac{E|U|^p}{p} + \frac{E|V|^q}{q}. \end{equation} Choose $U=\frac{|X|}{\left(E|X|^p\right)^\frac{1}{p}}$ and $V=\frac{|Y|}{\left(E|Y|^q\right)^\frac{1}{q}}$. We obtain
\begin{align}
\frac{E|XY|}{\left(E|X|^p\right)^{\frac{1}{p}}\left(E|Y|^q\right)^{\frac{1}{q}}} \leq \frac{E|X|^p}{p\,E|X|^p} + \frac{E|Y|^q}{q\,E|Y|^q} =\frac{1}{p}+\frac{1}{q} = 1,
\end{align}
which is the desired result.
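The inequality is easy to check numerically for an empirical (uniform) distribution over a few sample points; the exponents $p=3$, $q=\frac{3}{2}$ and the data below are arbitrary illustrative choices.

```python
# Numeric check of Hölder's inequality for a uniform distribution on sample points.
p, q = 3.0, 1.5                 # conjugate exponents: 1/3 + 2/3 = 1
xs = [0.5, -1.2, 2.0, 3.3]
ys = [1.0, 0.7, -2.5, 0.1]
n = len(xs)

lhs = sum(abs(x * y) for x, y in zip(xs, ys)) / n
rhs = (sum(abs(x)**p for x in xs) / n) ** (1 / p) * \
      (sum(abs(y)**q for y in ys) / n) ** (1 / q)
print(lhs, rhs)
assert lhs <= rhs
```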
-
Problem
Show that if $h:\mathbb{R}\mapsto \mathbb{R}$ is convex and non-decreasing, and $g:\mathbb{R}\mapsto \mathbb{R}$ is convex, then $h(g(x))$ is a convex function.
- Solution
-
Since $g$ is convex, we have \begin{equation} g(\alpha x+ (1-\alpha)y) \leq \alpha g(x) + (1-\alpha) g(y), \quad \textrm{for all } \alpha \in [0,1]. \end{equation} Therefore,
\begin{align}
h(g(\alpha x+(1- \alpha) y)) &\leq h(\alpha g(x)+(1-\alpha)g(y)) \quad (h \textrm{ is non-decreasing})\\
&\leq \alpha h(g(x))+(1-\alpha) h(g(y)) \quad (h \textrm{ is convex}),
\end{align}
so $h(g(x))$ is convex.
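A quick randomized spot-check of the convexity of $h(g(x))$, with the illustrative choices $h(x)=e^x$ (convex and non-decreasing) and $g(x)=x^2$ (convex):

```python
import math
import random

# Midpoint-style convexity check of f(x) = h(g(x)) = e^(x^2) at random points.
random.seed(0)
h = math.exp
g = lambda t: t * t
f = lambda t: h(g(t))

for _ in range(1000):
    x, y = random.uniform(-2, 2), random.uniform(-2, 2)
    a = random.random()
    assert f(a * x + (1 - a) * y) <= a * f(x) + (1 - a) * f(y) + 1e-9
print("no counterexample found")
```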
-
Problem
Let $X$ be a positive random variable with $EX=10$. What can you say about the following quantities?
- $E\big[\frac{1}{X+1}\big]$
- $E\big[e^{\frac{1}{X+1}}\big]$
- $E[\ln \sqrt{X}]$
- Solution
-
-
If $g(x)=\frac{1}{x+1}$, then \begin{equation} g''(x) = \frac{2}{(1+x)^3} > 0, \quad \textrm{for } x>0. \end{equation} Thus, $g$ is convex on $(0,\infty)$, and by Jensen's inequality
\begin{align}
E\left[\frac{1}{X+1}\right] \geq \frac{1}{1+EX} =\frac{1}{1+10} =\frac{1}{11}.
\end{align}
-
If we let $h(x)=e^x$ and $g(x)=\frac{1}{1+x}$, then $h$ is convex and non-decreasing and $g$ is convex on $(0,\infty)$. Thus, by the previous problem, $e^{\frac{1}{x+1}}$ is a convex function of $x$, so
\begin{align}
E\left[e^{\frac{1}{1+X}}\right] \geq e^{\frac{1}{1+EX}} = e^{\frac{1}{11}} \quad (\textrm{by Jensen's inequality}).
\end{align}
-
If $g(x) = \ln{\sqrt{x}} = \frac{1}{2} \ln{x}$, then $g'(x)=\frac{1}{2x}$ and $g''(x)=-\frac{1}{2x^2}<0$ for $x>0$. Thus, $g$ is concave on $(0,\infty)$, and we conclude
\begin{align}
E\left[\ln{\sqrt X}\right] =E\left[\frac{1}{2} \ln X \right] \leq \frac{1}{2} \ln{EX} = \frac{1}{2} \ln 10 \quad (\textrm{by Jensen's inequality}).
\end{align}
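All three bounds can be spot-checked by Monte Carlo; since only $EX=10$ is specified, we take $X \sim Exponential(\frac{1}{10})$ as one convenient positive distribution with mean $10$.

```python
import math
import random

# Monte Carlo check of the three Jensen bounds with X ~ Exponential(1/10),
# one convenient positive distribution with EX = 10.
random.seed(0)
n = 200_000
xs = [random.expovariate(0.1) for _ in range(n)]

e_inv = sum(1 / (x + 1) for x in xs) / n
e_exp = sum(math.exp(1 / (x + 1)) for x in xs) / n
e_log = sum(0.5 * math.log(x) for x in xs) / n

print(e_inv, ">=", 1 / 11)               # E[1/(X+1)]     >= 1/11
print(e_exp, ">=", math.exp(1 / 11))     # E[e^(1/(X+1))] >= e^(1/11)
print(e_log, "<=", 0.5 * math.log(10))   # E[ln sqrt(X)]  <= (ln 10)/2
```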
-
-