10.1.2 Mean and Correlation Functions
Mean Function of a Random Process:
For a random process $\big\{X(t), t \in J \big\}$, the mean function $\mu_X(t):J \rightarrow \mathbb{R}$ is defined as \begin{align}%\label{} \mu_X(t)=E[X(t)] \end{align}
Example
Find the mean functions for the random processes given in Examples 10.1 and 10.2.
- Solution
- For $\big\{X_n, n=0,1,2, \cdots \big\}$ given in Example 10.1, we have \begin{align}%\label{} \mu_X(n)&=E[X_n]\\ &=1000 E[Y^n] \quad \big(\textrm{where } Y=1+R \sim \textrm{Uniform}(1.04,1.05) \big)\\ &=1000 \int_{1.04}^{1.05} 100 y^n \, dy \quad (\textrm{by LOTUS, since } f_Y(y)=100 \textrm{ on } [1.04,1.05])\\ &=\frac{10^5}{n+1} \bigg[y^{n+1}\bigg]_{1.04}^{1.05}\\ &=\frac{10^5}{n+1} \bigg[(1.05)^{n+1}-(1.04)^{n+1}\bigg], \quad \textrm{ for all }n \in \{0,1,2,\cdots\}. \end{align} For $\big\{X(t), t \in [0,\infty) \big\}$ given in Example 10.2, we have \begin{align}%\label{} \mu_X(t)&=E[X(t)]\\ &=E[A+Bt]\\ &=E[A]+E[B]t\\ &=1+t, \quad \textrm{ for all }t \in [0,\infty). \end{align}
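As a quick sanity check, the closed-form mean for Example 10.1 can be compared against a Monte Carlo estimate. The following is an illustrative sketch (sample size and seed are arbitrary choices):

```python
import numpy as np

def mu_X(n):
    # Closed form derived above: 10^5/(n+1) * [(1.05)^(n+1) - (1.04)^(n+1)]
    return 1e5 / (n + 1) * (1.05 ** (n + 1) - 1.04 ** (n + 1))

rng = np.random.default_rng(seed=0)
Y = rng.uniform(1.04, 1.05, size=500_000)   # Y = 1 + R ~ Uniform(1.04, 1.05)

n = 10
empirical = np.mean(1000 * Y ** n)          # sample mean of X_n = 1000 * Y^n
print(empirical, mu_X(n))                   # the two values should agree closely
```

With half a million samples the relative error of the empirical mean is far below one percent, confirming the LOTUS computation.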
Autocorrelation and Autocovariance:
The mean function $\mu_X(t)$ gives us the expected value of $X(t)$ at time $t$, but it does not tell us anything about how $X(t_1)$ and $X(t_2)$ are related. To gain some insight into the relationship between $X(t_1)$ and $X(t_2)$, we define the correlation and covariance functions.
For a random process $\big\{X(t), t \in J \big\}$, the autocorrelation function or, simply, the correlation function, $R_X(t_1,t_2)$, is defined by
\begin{align}%\label{}
R_X(t_1,t_2)=E[X(t_1)X(t_2)], \quad \textrm{for }t_1,t_2 \in J.
\end{align}
For a random process $\big\{X(t), t \in J \big\}$, the autocovariance function or, simply, the covariance function, $C_X(t_1,t_2)$, is defined by \begin{align}%\label{} C_X(t_1,t_2)&=\textrm{Cov}\big(X(t_1),X(t_2)\big)\\ &=R_X(t_1,t_2)-\mu_X(t_1) \mu_X(t_2), \quad \textrm{for }t_1,t_2 \in J. \end{align}
Note that if we let $t_1=t_2=t$, we obtain
\begin{align}%\label{}
R_X(t,t)&=E[X(t)X(t)]\\
&=E[X(t)^2], \quad \textrm{for }t \in J;
\end{align}
\begin{align}%\label{}
C_X(t,t)&=\textrm{Cov}\big(X(t),X(t)\big)\\
&=\textrm{Var}\big(X(t)\big) ,\quad \textrm{for }t \in J.
\end{align}
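These diagonal identities can be verified numerically for the process of Example 10.1. The sketch below (sample size and seed are illustrative) estimates $R_X(n,n)$ and $C_X(n,n)$ from simulated sample paths and compares $C_X(n,n)$ with the sample variance:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
Y = rng.uniform(1.04, 1.05, size=500_000)   # Y = 1 + R ~ Uniform(1.04, 1.05)
n = 5
X = 1000 * Y ** n                           # samples of X_n = 1000 * Y^n

R_diag = np.mean(X * X)                     # estimate of R_X(n, n) = E[X_n^2]
C_diag = R_diag - np.mean(X) ** 2           # C_X(n, n) = R_X(n, n) - mu_X(n)^2
print(C_diag, np.var(X))                    # C_X(n, n) should match Var(X_n)
```

Up to floating-point rounding, `C_diag` and `np.var(X)` coincide, since both compute $E[X^2]-\big(E[X]\big)^2$ on the same samples.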
If $t_1 \neq t_2$, then the covariance function $C_X(t_1,t_2)$ gives us some information about how $X(t_1)$ and $X(t_2)$ are statistically related. In particular, note that
\begin{align}%\label{}
C_X(t_1,t_2)&=E\bigg[\bigg(X(t_1)-E\big[X(t_1)\big]\bigg)\bigg(X(t_2)-E\big[X(t_2)\big]\bigg)\bigg].
\end{align}
Intuitively, $C_X(t_1,t_2)$ shows how $X(t_1)$ and $X(t_2)$ move relative to each other.
If large values of $X(t_1)$ tend to imply large values of $X(t_2)$, then $\big(X(t_1)-E\big[X(t_1)\big]\big)\big(X(t_2)-E\big[X(t_2)\big]\big)$ is positive on average. In this case, $C_X(t_1,t_2)$ is positive, and we say $X(t_1)$ and $X(t_2)$ are positively correlated. On the other hand, if large values of $X(t_1)$ tend to imply small values of $X(t_2)$, then $\big(X(t_1)-E\big[X(t_1)\big]\big)\big(X(t_2)-E\big[X(t_2)\big]\big)$ is negative on average, and we say $X(t_1)$ and $X(t_2)$ are negatively correlated.
If $C_X(t_1,t_2)=0$, then $X(t_1)$ and $X(t_2)$ are uncorrelated.
Example
Find the correlation functions and covariance functions for the random processes given in Examples 10.1 and 10.2.
- Solution
- For $\big\{X_n, n=0,1,2, \cdots \big\}$ given in Example 10.1, we have \begin{align}%\label{} R_X(m,n)&=E[X_m X_n]\\ &=10^6 E[Y^m Y^n] \quad \big(\textrm{where } Y=1+R \sim \textrm{Uniform}(1.04,1.05) \big)\\ &=10^6 \int_{1.04}^{1.05} 100 y^{m+n} \, dy \quad (\textrm{by LOTUS})\\ &=\frac{10^8}{m+n+1} \bigg[y^{m+n+1}\bigg]_{1.04}^{1.05}\\ &=\frac{10^8}{m+n+1} \bigg[(1.05)^{m+n+1}-(1.04)^{m+n+1}\bigg], \quad \textrm{ for all }m,n \in \{0,1,2,\cdots\}. \end{align} To find the covariance function, we write \begin{align}%\label{} C_X(m,n)&=R_X(m,n)-E[X_m] E[X_n]\\ &=\frac{10^8}{m+n+1} \bigg[(1.05)^{m+n+1}-(1.04)^{m+n+1}\bigg]\\ &-\frac{10^{10}}{(m+1)(n+1)} \bigg[(1.05)^{m+1}-(1.04)^{m+1}\bigg] \bigg[(1.05)^{n+1}-(1.04)^{n+1}\bigg]. \end{align} For $\big\{X(t), t \in [0,\infty) \big\}$ given in Example 10.2, we have \begin{align}%\label{} R_X(t_1,t_2)&=E[X(t_1)X(t_2)]\\ &=E[(A+Bt_1)(A+Bt_2)]\\ &=E[A^2]+E[AB](t_1+t_2)+E[B^2]t_1t_2\\ &=2+E[A]E[B](t_1+t_2)+2t_1t_2 \quad (\textrm{since $A$ and $B$ are independent})\\ &=2+t_1+t_2+2t_1t_2, \quad \textrm{ for all }t_1,t_2 \in [0,\infty). \end{align} Finally, to find the covariance function for $X(t)$, we can write \begin{align}%\label{} C_X(t_1,t_2)&=R_X(t_1,t_2)-E[X(t_1)]E[X(t_2)]\\ &=2+t_1+t_2+2t_1t_2-(1+t_1)(1+t_2)\\ &=1+t_1t_2, \quad \textrm{ for all }t_1,t_2 \in [0,\infty). \end{align}
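The closed forms for Example 10.2 can also be checked by simulation. The sketch below assumes, purely for illustration, that $A$ and $B$ are independent $N(1,1)$ random variables, which matches the moments used above ($E[A]=E[B]=1$, $E[A^2]=E[B^2]=2$); any other distribution with these moments would give the same $R_X$ and $C_X$:

```python
import numpy as np

rng = np.random.default_rng(seed=2)
N = 1_000_000
# Illustrative assumption: A, B independent Normal(1, 1), consistent with
# E[A] = E[B] = 1 and E[A^2] = E[B^2] = 2 from Example 10.2.
A = rng.normal(1.0, 1.0, size=N)
B = rng.normal(1.0, 1.0, size=N)

t1, t2 = 0.5, 2.0
X1 = A + B * t1                             # samples of X(t1)
X2 = A + B * t2                             # samples of X(t2)

R_hat = np.mean(X1 * X2)                    # estimate of R_X(t1, t2)
C_hat = R_hat - X1.mean() * X2.mean()       # estimate of C_X(t1, t2)
print(R_hat, 2 + t1 + t2 + 2 * t1 * t2)     # closed form: 2 + t1 + t2 + 2 t1 t2
print(C_hat, 1 + t1 * t2)                   # closed form: 1 + t1 t2
```

For $t_1=0.5$ and $t_2=2$, the estimates should land near $R_X=6.5$ and $C_X=2$; the positive covariance reflects the shared randomness in $A$ and $B$ across times.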