7.1.3 Solved Problems
Problem
There are 100 men on a plane. Let $X_i$ be the weight (in pounds) of the $i$th man on the plane. Suppose that the $X_i$'s are i.i.d., and $EX_i = \mu = 170$ and $\sigma_{{\large X_i}} = \sigma = 30$. Find the probability that the total weight of the men on the plane exceeds 18,000 pounds.
- Solution
- If $W$ is the total weight, then $W=X_1+X_2+\cdots+X_n$, where $n=100$. We have \begin{align} EW&=n\mu \\ &=(100)(170)\\ &=17000,\\ \mathrm{Var}(W)&=100\mathrm{Var}(X_i) \\ &=(100)(30)^2 \\ &=90000. \end{align} Thus, $\sigma_W=300$. We have \begin{align} P(W > 18000) &= P\left(\frac{W-17000}{300} > \frac{18000-17000}{300}\right)\\ &=P\left(\frac{W-17000}{300} > \frac{10}{3}\right)\\ &= 1- \Phi\left(\frac{10}{3}\right) \quad (\textrm{by CLT}) \\ &\approx 4.3\times10^{-4}. \end{align}
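The normal-approximation value above can be checked numerically. The following Python snippet is a minimal sketch (assuming NumPy and SciPy are available); since the problem only specifies the mean and standard deviation of each $X_i$, the Monte Carlo part additionally assumes normally distributed weights.
```python
# A minimal sketch (not part of the original solution). The CLT estimate only
# needs the mean and standard deviation of each X_i; the simulation below
# additionally assumes normally distributed weights, which is an extra assumption.
import numpy as np
from scipy.stats import norm

mu, sigma, n = 170, 30, 100
EW = n * mu                      # EW = 17000
sigma_W = np.sqrt(n) * sigma     # sigma_W = 300

clt_estimate = 1 - norm.cdf((18000 - EW) / sigma_W)
print(clt_estimate)              # about 4.3e-4

# Monte Carlo check under the assumed normal weight distribution
rng = np.random.default_rng(0)
W = rng.normal(mu, sigma, size=(100_000, n)).sum(axis=1)
print(np.mean(W > 18000))        # same order of magnitude as the CLT estimate
```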
Problem
Let $X_1$, $X_2$, $\cdots$, $X_{25}$ be i.i.d. with the following PMF:
\begin{equation} \nonumber P_X(k) = \left\{ \begin{array}{l l} 0.6 & \quad k=1 \\ 0.4 & \quad k=-1 \\ 0 & \quad \text{otherwise} \end{array} \right. \end{equation} and let \begin{align}%\label{} Y=X_1+X_2+\cdots+X_{25}. \end{align} Using the CLT and continuity correction, estimate $P(4 \leq Y \leq 6)$.
- Solution
- We have \begin{align} EX_i&=(0.6)(1)+(0.4)(-1) \\ &=\frac{1}{5}, \end{align} \begin{align} EX_i^2&=0.6+0.4 \\ &=1. \end{align} Therefore, \begin{align} \mathrm{Var}(X_i)&=1-\frac{1}{25} \\ &=\frac{24}{25}; \\ \textrm{thus,} \quad \sigma_{X_i}&=\frac{2\sqrt{6}}{5}. \end{align} Thus, \begin{align} EY&=25 \times \frac{1}{5} \\ &=5, \end{align} \begin{align} \mathrm{Var}(Y)&=25 \times \frac{24}{25} \\ &=24; \\ \textrm{thus,} \quad \sigma_Y&=2\sqrt{6}. \end{align} \begin{align} P(4 \leq Y \leq 6) &= P(3.5 \leq Y \leq 6.5) \quad (\textrm{continuity correction}) \\ &=P\left(\frac{3.5-5}{2\sqrt{6}} \leq \frac{Y-5}{2\sqrt{6}} \leq \frac{6.5-5}{2\sqrt{6}}\right) \\ &= P\left(-0.3062 \leq \frac{Y-5}{2\sqrt{6}} \leq +0.3062\right) \\ &\approx \Phi(0.3062)-\Phi(-0.3062) \quad (\textrm{by the CLT}) \\ &= 2\Phi(0.3062)-1\\ &\approx 0.2405. \end{align}
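As a numerical sanity check (a sketch assuming SciPy), note that $Y=2K-25$, where $K \sim Binomial(25, 0.6)$ counts the $X_i$'s equal to $+1$, so $P(4 \leq Y \leq 6)=P(Y=5)=P(K=15)$; the snippet below computes both the exact value and the CLT estimate obtained above.
```python
# A sketch comparing the CLT estimate with the exact value. If K counts the
# X_i's equal to +1, then K ~ Binomial(25, 0.6), Y = 2K - 25, and
# P(4 <= Y <= 6) = P(Y = 5) = P(K = 15).
import numpy as np
from scipy.stats import binom, norm

sigma_Y = 2 * np.sqrt(6)
clt = norm.cdf((6.5 - 5) / sigma_Y) - norm.cdf((3.5 - 5) / sigma_Y)
exact = binom.pmf(15, 25, 0.6)

print(clt)     # CLT with the continuity correction above, about 0.2405
print(exact)   # exact probability, about 0.161
```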
Problem
You have invited $64$ guests to a party. You need to make sandwiches for the guests. You believe that a guest might need $0$, $1$ or $2$ sandwiches with probabilities $\frac{1}{4}$, $\frac{1}{2}$, and $\frac{1}{4}$ respectively. You assume that the number of sandwiches each guest needs is independent from other guests. How many sandwiches should you make so that you are $95\%$ sure that there is no shortage?
- Solution
-
Let $X_i$ be the number of sandwiches that the $i$th person needs, and let
\begin{align}
Y=X_1+X_2+\cdots+X_{64}.
\end{align}
The goal is to find $y$ such that
\begin{equation}
P(Y \leq y) \geq 0.95
\end{equation}
First note that
\begin{align}
EX_i&=\frac{1}{4}(0)+\frac{1}{2}(1)+\frac{1}{4}(2) \\
&=1,
\end{align}
\begin{align}
EX_{i}^2&=\frac{1}{4}(0^2)+\frac{1}{2}(1^2)+\frac{1}{4}(2^2) \\
&=\frac{3}{2}.
\end{align}
Thus,
\begin{align}
\mathrm{Var} (X_i)&= EX_i^2-(EX_i)^2 \\
&=\frac{3}{2}-1\\
&=\frac{1}{2} \quad \rightarrow \quad \sigma_{X_i}=\frac{1}{\sqrt{2}}.
\end{align}
Thus,
\begin{align}
EY&= 64 \times 1 \\
&=64,
\end{align}
\begin{align}
\mathrm{Var}(Y)&= 64 \times \frac{1}{2} \\
&=32 \rightarrow \sigma_Y=4\sqrt{2}.
\end{align}
Now, we can use the CLT to find $y$:
\begin{align}
P(Y \leq y)&=P\left(\frac{Y-64}{4\sqrt{2}} \leq \frac{y-64}{4\sqrt{2}}\right) \\
&= \Phi\left(\frac{y-64}{4\sqrt{2}}\right) \quad (\textrm{by CLT}).
\end{align}
We can write
\begin{equation}
\Phi\left(\frac{y-64}{4\sqrt{2}}\right)=0.95
\end{equation}
Therefore,
\begin{align}
\frac{y-64}{4\sqrt{2}} &= \Phi^{-1}(0.95) \\
&\approx 1.6449
\end{align}
Thus, $y \approx 73.3$.
Therefore, if you make 74 sandwiches, you are $95\%$ sure that there is no shortage. Note that you can find the numerical value of $\Phi^{-1}(0.95)$ by running the norminv(0.95) command in MATLAB.
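The calculation can also be reproduced with the short Python sketch below (assuming SciPy; norm.ppf plays the role of MATLAB's norminv).
```python
# A minimal sketch of the computation above; scipy's norm.ppf plays the role
# of MATLAB's norminv.
import numpy as np
from scipy.stats import norm

n, mu, var = 64, 1.0, 0.5
EY = n * mu                        # EY = 64
sigma_Y = np.sqrt(n * var)         # sigma_Y = 4*sqrt(2)

y = EY + sigma_Y * norm.ppf(0.95)  # solves Phi((y - 64) / sigma_Y) = 0.95
print(y)                           # about 73.3
print(int(np.ceil(y)))             # 74 sandwiches
```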
Problem
Let $X_1$, $X_2$, $\cdots$, $X_n$ be i.i.d. $Exponential(\lambda)$ random variables with $\lambda=1$. Let
\begin{align}%\label{} \overline{X}=\frac{X_1+X_2+\cdots+X_n}{n}. \end{align} How large should $n$ be such that \begin{align}%\label{} P\left(0.9 \leq \overline{X} \leq 1.1\right) \geq 0.95 \textrm{?} \end{align}
- Solution
- Let $Y=X_1+X_2+\cdots+X_n$, so $\overline{X}=\frac{Y}{n}$. Since $X_i \sim Exponential(1)$, we have \begin{align} E(X_i)=\frac{1}{\lambda}=1,\qquad \mathrm{Var}(X_i)=\frac{1}{\lambda^2}=1. \end{align} Therefore, \begin{align} E(Y)=n EX_i=n,\qquad \mathrm{Var}(Y)=n \mathrm{Var}(X_i)=n, \end{align} \begin{align} P(0.9 \leq \overline{X} \leq 1.1) &= P\left(0.9 \leq \frac{Y}{n} \leq 1.1\right)\\ &=P\big(0.9n \leq Y \leq 1.1n\big)\\ &=P\left(\frac{0.9n-n}{\sqrt{n}} \leq \frac{Y-n}{\sqrt{n}}\leq \frac{1.1n-n}{\sqrt{n}}\right)\\ &= P\left(-0.1\sqrt{n} \leq \frac{Y-n}{\sqrt{n}}\leq 0.1\sqrt{n}\right). \end{align} By the CLT $\frac{Y-n}{\sqrt{n}}$ is approximately $N(0,1)$, so \begin{align} P(0.9 \leq \overline{X} \leq 1.1) &\approx \Phi\left(0.1\sqrt{n}\right) - \Phi\left(-0.1\sqrt{n}\right) \\ &=2\Phi\left(0.1\sqrt{n}\right)-1 \quad (\textrm{since} \quad \Phi(-x)=1-\Phi(x)). \end{align} We need to have \begin{align} 2\Phi\left(0.1\sqrt{n}\right)-1 &\geq 0.95, \qquad \textrm{so} \quad \Phi \left(0.1\sqrt{n}\right)\geq 0.975 . \end{align} Thus, \begin{align} 0.1\sqrt{n} &\geq \Phi^{-1}(0.975)=1.96 \\ \sqrt{n} &\geq 19.6 \\ n &\geq 384.16 \end{align} Since $n$ is an integer, we conclude $n \geq 385$.
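The final step can be reproduced with a short Python sketch (assuming SciPy):
```python
# A sketch of the final step: the smallest integer n with 2*Phi(0.1*sqrt(n)) - 1 >= 0.95.
import numpy as np
from scipy.stats import norm

z = norm.ppf(0.975)                 # about 1.96
n_min = int(np.ceil((z / 0.1)**2))  # sqrt(n) >= z / 0.1, so n >= 384.16
print(n_min)                        # 385
```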
Problem
For this problem and the next, you will need to be familiar with moment generating functions (Section 6.1.3). The goal here is to prove the (weak) law of large numbers using MGFs. In particular, let $X_1$, $X_2$, $\cdots$, $X_n$ be i.i.d. random variables with expected value $EX_i=\mu < \infty$ and MGF $M_X(s)$ that is finite on some interval $[-c,c]$, where $c>0$ is a constant. As usual, let \begin{align}%\label{} \overline{X}=\frac{X_1+X_2+\cdots+X_n}{n}. \end{align} Prove \begin{align} \lim_{n\rightarrow \infty} M_{\overline{X}}(s)=e^{{\large s\mu}}, \qquad \textrm{ for all }s \in [-c,c]. \end{align} Since this is the MGF of the constant random variable $\mu$, we conclude that the distribution of $\overline{X}$ converges to $\mu$. Hint: Use the result of Problem 8 in Section 6.1.6: for a random variable $X$ with a well-defined MGF, $M_X(s)$, we have \begin{equation} \lim_{n\rightarrow\infty} \left[M_X\left(\frac{s}{n}\right)\right]^{\large n}=e^{{\large sEX}}. \end{equation}
- Solution
- We have \begin{align}%\label{} M_{\overline{X}}(s)&=E[e^{s\overline{X}}]\\ &=E[e^{s\frac{X_1+X_2+\cdots+X_n}{n}}] \\ &= E[e^{s\frac{X_1}{n}}e^{s\frac{X_2}{n}}\cdots e^{s\frac{X_n}{n}}]\\ &=E[e^{\frac{sX_1}{n}}]\cdot E[e^{\frac{sX_2}{n}}]\cdots E[e^{\frac{sX_n}{n}}] \quad (\textrm{since the $X_i$'s are independent}) \\ &=\left[M_{X}\left(\frac{s}{n}\right)\right]^n \quad (\textrm{since the $X_i$'s are identically distributed}) \end{align} Therefore, \begin{align} \lim_{n \rightarrow \infty} M_{\overline{X}}(s) &= \lim_{n \rightarrow \infty} [M_{X}\left(\frac{s}{n}\right)]^n\\ &=e^{{\large sEX}} \quad (\textrm{by the hint})\\ &=e^{{\large s\mu}}. \end{align} Note that $e^{{\large s\mu}}$ is the MGF of a constant random variable $Y$, with value $Y=\mu$. This means that the random variable $\overline{X}$ converges to $\mu$ (in distribution).
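The limit can also be illustrated numerically for a concrete distribution. The Python sketch below (assuming NumPy) uses $X_i \sim Bernoulli(p)$, whose MGF is $M_X(s)=1-p+pe^s$ and whose mean is $p$; the Bernoulli choice is only for illustration and is not part of the proof.
```python
# A numerical illustration of the limit (not part of the proof). The Bernoulli(p)
# distribution is chosen only for illustration: M_X(s) = 1 - p + p*exp(s), EX = p.
import numpy as np

p, s = 0.3, 1.0

def M_X(t):
    return 1 - p + p * np.exp(t)

for n in [1, 10, 100, 1000, 10000]:
    print(n, M_X(s / n) ** n)       # approaches exp(s * p) = exp(0.3), about 1.3499
```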
Problem
The goal in this problem is to prove the central limit theorem using MGFs. In particular, let $X_1$, $X_2$, $\cdots$, $X_n$ be i.i.d. random variables with expected value $EX_i=\mu < \infty$, $\mathrm{Var}(X_i)=\sigma^2< \infty$, and MGF $M_X(s)$ that is finite on some interval $[-c,c]$, where $c>0$ is a constant. As usual, let
\begin{align}%\label{} Z_n=\frac{\overline{X}-\mu}{\sigma / \sqrt{n}}=\frac{X_1+X_2+\cdots+X_n-n\mu}{\sqrt{n} \sigma}. \end{align} Prove \begin{align} \lim_{n\rightarrow \infty} M_{Z_n}(s)=e^{{\large \frac{s^2}{2}}}, \qquad \textrm{ for all }s \in [-c,c]. \end{align} Since this is the MGF of a standard normal random variable, we conclude that the distribution of $Z_n$ converges to the standard normal distribution. Hint: Use the result of Problem 9 in Section 6.1.6: for a random variable $Y$ with a well-defined MGF, $M_Y(s)$, and $EY=0$, $\mathrm{Var}(Y)=1$, we have \begin{align}%\label{} \lim_{n \rightarrow \infty} \left[M_Y\left(\frac{s}{\sqrt{n}}\right)\right]^n=e^{{\large \frac{s^2}{2}}}. \end{align}
- Solution
- Let the $Y_i$'s be the normalized versions of the $X_i$'s, i.e., \begin{align}%\label{} Y_i=\frac{X_i-\mu}{\sigma}. \end{align} Then, the $Y_i$'s are i.i.d. with \begin{align}%\label{} EY_i&=0,\\ \mathrm{Var}(Y_i)&=1. \end{align} We also have \begin{align}%\label{} Z_n&=\frac{\overline{X}-\mu}{\frac{\sigma}{\sqrt{n}}} \\ &=\frac{Y_1+Y_2+\cdots+Y_n}{\sqrt{n}}. \end{align} Thus, we have \begin{align}%\label{} M_{Z_n}(s)&=E[e^{s\frac{Y_1+Y_2+\cdots+Y_n}{\sqrt{n}}}] \\ &=E[e^{\frac{sY_1}{\sqrt{n}}} e^{\frac{sY_2}{\sqrt{n}}} \cdots e^{\frac{sY_n}{\sqrt{n}}}] \\ &=E[e^{\frac{sY_1}{\sqrt{n}}}] \cdot E[e^{\frac{sY_2}{\sqrt{n}}}] \cdots E[e^{\frac{sY_n}{\sqrt{n}}}] \quad (\textrm{since the $Y_i$'s are independent}) \\ &=\left[M_{Y_1}\left(\frac{s}{\sqrt{n}}\right)\right]^n \quad (\textrm{since the $Y_i$'s are identically distributed}). \end{align} Thus, we conclude \begin{align}%\label{} \lim_{n \rightarrow \infty} M_{Z_n}(s)&=\lim_{n \rightarrow \infty} \left[M_{Y_1}\left(\frac{s}{\sqrt{n}}\right)\right]^n \\ &=e^{{\large \frac{s^2}{2}}} \quad (\textrm{by the hint}). \end{align} Since this is the MGF of a standard normal random variable, we conclude that the CDF of $Z_n$ converges to the standard normal CDF.
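As in the previous problem, the limit can be illustrated numerically for a concrete choice of distribution. The Python sketch below (assuming NumPy) uses standardized $Exponential(1)$ variables, $Y_i = X_i - 1$, whose MGF is $M_Y(s)=\frac{e^{-s}}{1-s}$ for $s<1$; this particular choice is only for illustration.
```python
# A numerical illustration of the limit (not part of the proof). Here the Y_i are
# standardized Exponential(1) variables, Y = X - 1, with MGF
# M_Y(s) = exp(-s) / (1 - s) for s < 1; this choice is only for illustration.
import numpy as np

s = 0.5

def M_Y(t):
    return np.exp(-t) / (1 - t)

for n in [1, 10, 100, 1000, 10000]:
    print(n, M_Y(s / np.sqrt(n)) ** n)   # approaches exp(s**2 / 2), about 1.1331
```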