8.5.5 Solved Problems
Consider the following observed values of $(x_i,y_i)$:
\begin{equation} (-1,6), \quad (0,3), \quad (1,2), \quad (2,-1) \end{equation}
- Find the estimated regression line \begin{align} \hat{y} = \hat{\beta_0}+\hat{\beta_1} x, \end{align} based on the observed data.
- For each $x_i$, compute the fitted value of $y_i$ using \begin{align} \hat{y}_i = \hat{\beta_0}+\hat{\beta_1} x_i. \end{align}
- Compute the residuals, $e_i=y_i-\hat{y}_i$.
- Find $R$-squared (the coefficient of determination).
- Solution
- We have
\begin{align}
&\overline{x}=\frac{-1+0+1+2}{4}=0.5,\\
&\overline{y}=\frac{6+3+2+(-1)}{4}=2.5,\\
&s_{xx}=(-1-0.5)^2+(0-0.5)^2+(1-0.5)^2+(2-0.5)^2=5,\\
&s_{xy}=(-1-0.5)(6-2.5)+(0-0.5)(3-2.5)\\
& \quad +(1-0.5)(2-2.5)+(2-0.5)(-1-2.5)=-11.
\end{align}
Therefore, we obtain
\begin{align}
&\hat{\beta_1}=\frac{s_{xy}}{s_{xx}}=\frac{-11}{5}=-2.2,\\
&\hat{\beta_0}=2.5-(-2.2) (0.5)=3.6
\end{align}
The following MATLAB code can be used to obtain the estimated regression line:
x=[-1;0;1;2];              % observed x values
x0=ones(size(x));          % column of ones for the intercept term
y=[6;3;2;-1];              % observed y values
beta = regress(y,[x0,x]);  % beta(1) is beta0_hat, beta(2) is beta1_hat
- The fitted values are given by \begin{align} \hat{y}_i = 3.6-2.2 x_i, \end{align} so we obtain \begin{align} \hat{y}_1 =5.8, \quad \hat{y}_2 =3.6, \quad \hat{y}_3 =1.4, \quad \hat{y}_4 =-0.8. \end{align}
- We have \begin{align} &e_1=y_1-\hat{y}_1=6-5.8=0.2,\\ &e_2=y_2-\hat{y}_2=3-3.6=-0.6,\\ &e_3=y_3-\hat{y}_3=2-1.4=0.6,\\ &e_4=y_4-\hat{y}_4=-1-(-0.8)=-0.2 \end{align}
- We have \begin{align} &s_{yy}=(6-2.5)^2+(3-2.5)^2+(2-2.5)^2+(-1-2.5)^2=25. \end{align} We conclude \begin{align} r^2=\frac{(-11)^2}{5 \times 25} \approx 0.968 \end{align}
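Continuing the MATLAB snippet above (and assuming the variables x, y, and beta are still in the workspace), the fitted values, residuals, and coefficient of determination can also be computed directly. This is a minimal sketch using only basic operations, with $r^2$ written in the equivalent form $1-\sum_{i} e_i^2/s_{yy}$:
yhat = beta(1) + beta(2)*x;                 % fitted values: 5.8, 3.6, 1.4, -0.8
e = y - yhat;                               % residuals: 0.2, -0.6, 0.6, -0.2
r2 = 1 - sum(e.^2)/sum((y-mean(y)).^2);     % coefficient of determination, approximately 0.968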
Problem
Consider the model
\begin{align} Y = \beta_0+\beta_1 X +\epsilon, \end{align} where $\epsilon$ is a $N(0,\sigma^2)$ random variable independent of $X$. Let also \begin{align} \hat{Y} = \beta_0+\beta_1 X. \end{align} Show that \begin{align} E[(Y-EY)^2] = E[(\hat{Y}-EY)^2]+E[(Y-\hat{Y})^2]. \end{align}
- Solution
- Since $X$ and $\epsilon$ are independent, we can write \begin{align} \textrm{Var}(Y) = \beta_1^2 \textrm{Var}(X) +\textrm{Var}(\epsilon). \hspace{50pt} (8.10) \end{align} Note that \begin{align} \hat{Y}-EY &= (\beta_0+\beta_1 X)-(\beta_0+\beta_1 EX)\\ &=\beta_1(X-EX). \end{align} Therefore, \begin{align} E[(\hat{Y}-EY)^2]=\beta_1^2 \textrm{Var}(X). \end{align} Also, \begin{align} E[(Y-EY)^2] = \textrm{Var}(Y), \quad E[(Y-\hat{Y})^2]=\textrm{Var}(\epsilon). \end{align} Combining with Equation 8.10, we conclude \begin{align} E[(Y-EY)^2] = E[(\hat{Y}-EY)^2]+E[(Y-\hat{Y})^2]. \end{align}
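As a sanity check, this identity can also be verified by simulation. The following MATLAB sketch uses arbitrary illustrative values for $\beta_0$, $\beta_1$, $\sigma$, and the distribution of $X$ (none of these choices come from the text); the two sides should agree up to Monte Carlo error.
n = 1e6; beta0 = 1; beta1 = 2; sigma = 1;    % illustrative values only
X = randn(n,1);                              % any distribution for X works; standard normal here
epsilon = sigma*randn(n,1);                  % noise, independent of X
Y = beta0 + beta1*X + epsilon;
Yhat = beta0 + beta1*X;
lhs = mean((Y - mean(Y)).^2)                            % estimates E[(Y-EY)^2]
rhs = mean((Yhat - mean(Y)).^2) + mean((Y - Yhat).^2)   % estimates E[(Yhat-EY)^2] + E[(Y-Yhat)^2]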
Problem
Show that, in a simple linear regression, the estimated coefficients $\hat{\beta_0}$ and $\hat{\beta_1}$ (least squares estimates of $\beta_0$ and $\beta_1$) satisfy the following equations
\begin{align} \sum_{i=1}^{n} e_i=0, \quad \sum_{i=1}^{n} e_i x_i =0, \quad \sum_{i=1}^{n} e_i \hat{y_i}=0, \end{align} where $e_i=y_i-\hat{y_i}=y_i-\hat{\beta_0}-\hat{\beta_1} x_i$. Hint: $\hat{\beta_0}$ and $\hat{\beta_1}$ satisfy Equation 8.8 and Equation 8.9. By cancelling the $(-2)$ factor, you can write \begin{align} \sum_{i=1}^{n} (y_i-\hat{\beta_0}-\hat{\beta_1} x_i)=0,\\ \sum_{i=1}^{n} (y_i-\hat{\beta_0}-\hat{\beta_1} x_i)x_i=0. \end{align} Use the above equations to show the desired equations.
- Solution
- We have \begin{align} \sum_{i=1}^{n} (y_i-\hat{\beta_0}-\hat{\beta_1} x_i)=0,\\ \sum_{i=1}^{n} (y_i-\hat{\beta_0}-\hat{\beta_1} x_i)x_i=0. \end{align} Since $e_i=y_i-\hat{\beta_0}-\hat{\beta_1} x_i$, we conclude \begin{align} \sum_{i=1}^{n} e_i=0,\\ \sum_{i=1}^{n} e_i x_i =0. \end{align} Moreover, \begin{align} \sum_{i=1}^{n} e_i \hat{y_i} &=\sum_{i=1}^n e_i (\hat{\beta_0}+\hat{\beta_1}x_i)\\ &=\hat{\beta_0} \sum_{i=1}^{n} e_i + \hat{\beta_1} \sum_{i=1}^{n} e_i x_i \\ &=0+0=0. \end{align}
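These identities are easy to check numerically; the MATLAB sketch below reuses the data and the least squares estimates $\hat{\beta_0}=3.6$ and $\hat{\beta_1}=-2.2$ from the first solved problem.
x = [-1;0;1;2]; y = [6;3;2;-1];
yhat = 3.6 - 2.2*x;      % fitted values
e = y - yhat;            % residuals
sum(e)                   % equals 0
sum(e.*x)                % equals 0
sum(e.*yhat)             % equals 0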
Problem
Show that the coefficient of determination can also be obtained as
\begin{align} r^2=\frac{\sum_{i=1}^n(\hat{y_i}-\overline{y})^2}{\sum_{i=1}^n(y_i-\overline{y})^2}. \end{align}
- Solution
- We know \begin{align} &\hat{y_i}=\hat{\beta_0}+\hat{\beta_1} x_i,\\ &\overline{y}=\hat{\beta_0}+\hat{\beta_1} \overline{x}, \end{align} where the second equation follows from $\hat{\beta_0}=\overline{y}-\hat{\beta_1}\overline{x}$. Therefore, \begin{align} \sum_{i=1}^n(\hat{y_i}-\overline{y})^2 &=\sum_{i=1}^n(\hat{\beta_1} x_i-\hat{\beta_1} \overline{x})^2\\ &=\hat{\beta_1}^2\sum_{i=1}^n( x_i-\overline{x})^2\\ &= \hat{\beta_1}^2 s_{xx}. \end{align} We conclude \begin{align} \frac{\sum_{i=1}^n(\hat{y_i}-\overline{y})^2}{\sum_{i=1}^n(y_i-\overline{y})^2}&=\frac{\hat{\beta_1}^2 s_{xx}}{s_{yy}}\\ &=\frac{s^2_{xy}}{s_{xx}s_{yy}} \quad (\textrm{since } \hat{\beta_1}=\frac{s_{xy}}{s_{xx}})\\ &=r^2. \end{align}
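For the data of the first solved problem, the two expressions for $r^2$ can be compared numerically; a short MATLAB sketch:
x = [-1;0;1;2]; y = [6;3;2;-1];
sxx = sum((x-mean(x)).^2); syy = sum((y-mean(y)).^2);
sxy = sum((x-mean(x)).*(y-mean(y)));
yhat = mean(y) + (sxy/sxx)*(x-mean(x));   % fitted values from the estimated line
r2_def = sxy^2/(sxx*syy)                  % original definition, 0.968
r2_alt = sum((yhat-mean(y)).^2)/syy       % alternative form above, also 0.968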
Problem
(The Method of Maximum Likelihood) This problem assumes that you are familiar with the maximum likelihood method discussed in Section 8.2.3. Consider the model
\begin{align} Y_i = \beta_0+\beta_1 x_i +\epsilon_i, \end{align} where $\epsilon_i$'s are independent $N(0,\sigma^2)$ random variables. Our goal is to estimate $\beta_0$ and $\beta_1$. We have the observed data pairs $(x_1,y_1)$, $(x_2,y_2)$, $\cdots$, $(x_n,y_n)$.
- Argue that, for given values of $\beta_0$, $\beta_1$, and $x_i$, $Y_i$ is a normal random variable with mean $\beta_0+\beta_1 x_i$ and variance $\sigma^2$. Moreover, show that the $Y_i$'s are independent.
- Find the likelihood function \begin{align} \nonumber L(y_1, y_2, \cdots, y_n; \beta_0, \beta_1)=f_{Y_1 Y_2 \cdots Y_n}(y_1, y_2, \cdots, y_n; \beta_0, \beta_1). \end{align}
- Show that the maximum likelihood estimates of $\beta_0$ and $\beta_1$ are the same as the ones we obtained using the least squares method.
- Solution
- Given values of $\beta_0$, $\beta_1$, and $x_i$, $c=\beta_0+\beta_1 x_i$ is a constant. Therefore, $Y_i=c+\epsilon_i$ is a normal random variable with mean $c$ and variance $\sigma^2$. Also, since the $\epsilon_i$'s are independent, we conclude that $Y_i$'s are also independent random variables.
- By the previous part, for given values of $\beta_0$, $\beta_1$, and $x_i$, \begin{align} f_{Y_i}(y_i; \beta_0, \beta_1)=\frac{1}{\sqrt{2 \pi \sigma^2}} \exp\left\{-\frac{1}{2\sigma^2}(y_i-\beta_0-\beta_1 x_i)^2\right\}. \end{align} Therefore, the likelihood function is given by \begin{align} L(y_1, y_2, \cdots, y_n; \beta_0, \beta_1)&=f_{Y_1 Y_2 \cdots Y_n}(y_1, y_2, \cdots, y_n; \beta_0, \beta_1)\\ &=f_{Y_1}(y_1; \beta_0, \beta_1) f_{Y_2}(y_2; \beta_0, \beta_1) \cdots f_{Y_n}(y_n; \beta_0, \beta_1)\\ &=\frac{1}{(2 \pi \sigma^2)^{\frac{n}{2}}} \exp\left\{-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(y_i-\beta_0-\beta_1 x_i)^2\right\}. \end{align}
- To find the maximum likelihood estimates (MLE) of $\beta_0$ and $\beta_1$, we need to find $\hat{\beta_0}$ and $\hat{\beta_1}$ such that the likelihood function \begin{align} L(y_1, y_2, \cdots, y_n; \beta_0, \beta_1)=\frac{1}{(2 \pi \sigma^2)^{\frac{n}{2}}} \exp\left\{-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(y_i-\beta_0-\beta_1 x_i)^2\right\} \end{align} is maximized. This is equivalent to minimizing \begin{align} \sum_{i=1}^{n}(y_i-\beta_0-\beta_1 x_i)^2. \end{align} The above expression is the sum of the squared errors, $g(\beta_0, \beta_1)$ (Equation 8.7). Therefore, the maximum likelihood estimates for this model are the same as the least squares estimates.
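The equivalence can also be illustrated numerically: minimizing the negative log-likelihood with a general-purpose optimizer recovers the least squares estimates. A minimal MATLAB sketch for the data of the first solved problem, assuming $\sigma=1$ (the value of $\sigma$ does not affect the maximizing $\beta_0$ and $\beta_1$):
x = [-1;0;1;2]; y = [6;3;2;-1]; sigma = 1;
% negative log-likelihood as a function of b = [beta0; beta1]
negloglik = @(b) numel(y)/2*log(2*pi*sigma^2) + sum((y - b(1) - b(2)*x).^2)/(2*sigma^2);
bhat = fminsearch(negloglik, [0; 0])   % close to the least squares estimates [3.6; -2.2]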