10.1.5 Gaussian Random Processes
Here, we briefly introduce normal (Gaussian) random processes; we will discuss some examples of Gaussian processes in more detail later on. Many random processes that arise in practice are special cases of normal random processes.
First, let us recall a few facts about Gaussian random vectors. As we saw before, the random variables $X_1$, $X_2$, ..., $X_n$ are said to be jointly normal if, for all $a_1, a_2, \dots, a_n \in \mathbb{R}$, the random variable
\begin{align}
a_1X_1+a_2X_2+\cdots+a_nX_n
\end{align}
is a normal random variable. Also, a random vector
\begin{equation}
\nonumber \textbf{X} = \begin{bmatrix}
X_1 \\
X_2 \\
\vdots \\
X_n
\end{bmatrix}
\end{equation}
is said to be normal or Gaussian if the random variables $X_1$, $X_2$, ..., $X_n$ are jointly normal. An important property of jointly normal random variables is that their joint PDF is completely determined by their mean vector and covariance matrix. More specifically, for a normal random vector $\textbf{X}$ with mean $\mathbf{m}$ and covariance matrix $\textbf{C}$, the PDF is given by
\begin{align*}
f_{\mathbf{X}}(\mathbf{x})=\frac{1}{(2\pi)^{\frac{n}{2}} \sqrt{\det\textbf{C}}} \exp \left\{-\frac{1}{2} (\textbf{x}-\textbf{m})^T \mathbf{C}^{-1}(\textbf{x}-\textbf{m}) \right\}.
\end{align*}
Now, let us define Gaussian random processes.
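As a quick numerical illustration, the following Python sketch evaluates this PDF directly from the formula and checks it against `scipy.stats.multivariate_normal`. The particular mean vector, covariance matrix, and evaluation point are arbitrary choices for the demonstration.

```python
import numpy as np
from scipy.stats import multivariate_normal

def normal_pdf(x, m, C):
    """Evaluate the jointly normal PDF f_X(x) from the formula above."""
    n = len(m)
    diff = x - m
    norm_const = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(C))
    quad_form = diff @ np.linalg.solve(C, diff)  # (x-m)^T C^{-1} (x-m)
    return np.exp(-0.5 * quad_form) / norm_const

# Arbitrary 2-dimensional example
m = np.array([0.0, 1.0])
C = np.array([[2.0, 0.5],
              [0.5, 1.0]])
x = np.array([0.3, 0.8])

print(normal_pdf(x, m, C))                        # formula above
print(multivariate_normal(mean=m, cov=C).pdf(x))  # SciPy; the two agree
```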
A random process $\big\{X(t), t \in J \big\}$ is said to be a Gaussian (normal) random process if, for all
\begin{align}%\label{}
& t_1,t_2, \dots, t_n \in J,
\end{align}
the random variables $X(t_1)$, $X(t_2)$,..., $X(t_n)$ are jointly normal.
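By this definition, simulating a Gaussian process at finitely many time points amounts to drawing a single jointly normal vector whose covariance matrix comes from the process's covariance function. A minimal sketch, assuming for concreteness the zero-mean covariance $C_X(t_i,t_j)=e^{-(t_i-t_j)^2}$ used in the example below:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample X(t_1), ..., X(t_n) for a zero-mean Gaussian process with
# covariance C_X(t_i, t_j) = exp(-(t_i - t_j)^2)
t = np.linspace(0, 5, 200)
C = np.exp(-(t[:, None] - t[None, :]) ** 2)  # n x n covariance matrix
C += 1e-10 * np.eye(len(t))                  # tiny jitter for numerical stability

# One sample path, evaluated at the chosen time points
x = rng.multivariate_normal(mean=np.zeros(len(t)), cov=C)
```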
Example
Let $X(t)$ be a zero-mean WSS Gaussian process with $R_X(\tau)=e^{-\tau^2}$, for all $\tau \in \mathbb{R}$.
- Find $P\big(X(1) \lt 1\big)$.
- Find $P\big(X(1)+X(2) \lt 1\big)$.
Solution
- $X(1)$ is a normal random variable with mean $E[X(1)]=0$ and variance
\begin{align*}
\textrm{Var}\big(X(1)\big)&=E[X(1)^2]\\
&=R_X(0)=1.
\end{align*}
Thus,
\begin{align*}
P\big(X(1) \lt 1\big)&=\Phi \left(\frac{1-0}{1} \right)\\
&=\Phi(1) \approx 0.84.
\end{align*}
- Let $Y=X(1)+X(2)$. Then, $Y$ is a normal random variable. We have
\begin{align*}
EY &=E[X(1)]+E[X(2)]\\
&=0;
\end{align*}
\begin{align*}
\textrm{Var}(Y) &=\textrm{Var}\big(X(1)\big)+\textrm{Var}\big(X(2)\big)+2 \textrm{Cov}\big(X(1),X(2)\big).
\end{align*}
Note that
\begin{align*}
\textrm{Var}\big(X(1)\big)&=E[X(1)^2]-E[X(1)]^2\\
&=R_X(0)- \mu_X^2\\
&=1-0=1=\textrm{Var}\big(X(2)\big);
\end{align*}
\begin{align*}
\textrm{Cov}\big(X(1),X(2)\big)&=E[X(1)X(2)]-E[X(1)]E[X(2)]\\
&=R_X(1-2)-\mu_X^2\\
&=e^{-1} -0=\frac{1}{e}.
\end{align*}
Therefore,
\begin{align*}
\textrm{Var}(Y) &=2+\frac{2}{e}.
\end{align*}
We conclude that $Y \sim N\left(0,2+\frac{2}{e}\right)$. Thus,
\begin{align*}
P\big(Y \lt 1\big)&=\Phi \left(\frac{1-0}{\sqrt{2+\frac{2}{e}}} \right)\\
&=\Phi(0.6046) \approx 0.73.
\end{align*}
(A simulation check of both answers is sketched after this solution.)
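As a sanity check, both probabilities can be estimated by Monte Carlo simulation, since $\big(X(1), X(2)\big)$ is a jointly normal vector with covariance entries given by $R_X$. A sketch (the sample size is an arbitrary choice):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Joint distribution of (X(1), X(2)): zero mean,
# covariance entries from R_X(tau) = exp(-tau^2)
C = np.array([[1.0, np.exp(-1.0)],
              [np.exp(-1.0), 1.0]])
samples = rng.multivariate_normal(mean=[0.0, 0.0], cov=C, size=10**6)

print(np.mean(samples[:, 0] < 1))           # ~ Phi(1) ~ 0.84
print(np.mean(samples.sum(axis=1) < 1))     # ~ 0.73
print(norm.cdf(1 / np.sqrt(2 + 2 / np.e)))  # exact: Phi(0.6046)
```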
An important property of normal random processes is that wide-sense stationarity and strict-sense stationarity are equivalent for these processes. More specifically, we can state the following theorem.
Theorem Consider the Gaussian random process $\big\{X(t), t \in \mathbb{R}\big\}$. If $X(t)$ is WSS, then $X(t)$ is a stationary process.
Proof
We need to show that, for all $t_1,t_2,\cdots, t_r \in \mathbb{R}$ and all $\Delta \in \mathbb{R}$, the joint CDF of
\begin{align}%\label{}
X(t_1), X(t_2), \cdots, X(t_r)
\end{align}
is the same as the joint CDF of
\begin{align}%\label{}
X(t_1+\Delta), X(t_2+\Delta), \cdots, X(t_r+\Delta).
\end{align}
Since these random variables are jointly Gaussian, it suffices to show that the mean vectors and the covariance matrices are the same. To see this, note that $X(t)$ is a WSS process, so
\begin{align}%\label{}
\mu_X(t_i)=\mu_X(t_j)=\mu_X, \quad \textrm{for all }i,j,
\end{align}
and
\begin{align}
C_X(t_i+\Delta,t_j+\Delta)=C_X(t_i,t_j)=C_X(t_i-t_j), \quad \textrm{for all }i,j.
\end{align}
From the above, we conclude that the mean vector and the covariance matrix of
\begin{align}
X(t_1), X(t_2), \cdots, X(t_r)
\end{align}
are the same as the mean vector and the covariance matrix of
\begin{align}
X(t_1+\Delta), X(t_2+\Delta), \cdots, X(t_r+\Delta).
\end{align}
Since the joint distribution of jointly normal random variables is completely determined by the mean vector and the covariance matrix, the two joint CDFs are the same, and we conclude that $X(t)$ is a stationary process.
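The key step can be mirrored numerically: for a WSS covariance such as $C_X(t_i,t_j)=e^{-(t_i-t_j)^2}$ (the covariance of the example above), the covariance matrix built on the shifted times is identical to the one built on the original times. A minimal Python sketch:

```python
import numpy as np

def cov_matrix(times, c=lambda tau: np.exp(-tau**2)):
    """Covariance matrix of X(t_1), ..., X(t_r) for a WSS covariance c(tau)."""
    times = np.asarray(times, dtype=float)
    return c(times[:, None] - times[None, :])

t = np.array([0.5, 1.2, 3.0])
delta = 4.7  # arbitrary shift

# WSS: the covariance depends only on time differences, so shifting
# every sample time by delta leaves the covariance matrix unchanged.
print(np.allclose(cov_matrix(t), cov_matrix(t + delta)))  # True
```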
Similarly, we can define jointly Gaussian random processes.
Two random processes $\big\{X(t), t \in J \big\}$ and $\big\{Y(t), t \in J' \big\}$ are said to be jointly Gaussian (normal), if for all
\begin{align}%\label{}
& t_1,t_2, \dots, t_m \in J\\
& \quad \quad \textrm{and}\\
& t'_1,t'_2, \dots, t'_n \in J',
\end{align}
the random variables
\begin{align}%\label{}
& X(t_1), X(t_2), \cdots, X(t_m), Y(t'_1), Y(t'_2), \cdots, Y(t'_n)
\end{align}
are jointly normal.
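Again, simulating two jointly Gaussian processes at finitely many time points reduces to drawing one stacked normal vector. A sketch, assuming (as a made-up illustration, not part of the text above) that $Y(t) = X(t+1)$ for the zero-mean process with $C_X(s,t)=e^{-(s-t)^2}$, which makes $X$ and $Y$ jointly Gaussian by construction:

```python
import numpy as np

rng = np.random.default_rng(2)

# Made-up illustration: Y(t) = X(t+1) for the zero-mean process with
# C_X(s, t) = exp(-(s - t)^2), so X and Y are jointly Gaussian.
tx = np.array([0.0, 0.5, 1.0])  # sample times for X
ty = np.array([0.2, 0.9])       # sample times for Y

s = np.concatenate([tx, ty + 1.0])           # Y(t') corresponds to X(t' + 1)
C = np.exp(-(s[:, None] - s[None, :]) ** 2)  # covariance of the stacked vector

# One draw of (X(t_1), ..., X(t_m), Y(t'_1), ..., Y(t'_n))
z = rng.multivariate_normal(np.zeros(len(s)), C)
x_samples, y_samples = z[:len(tx)], z[len(tx):]
print(x_samples, y_samples)
```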
Note that from the properties of jointly normal random variables, we can conclude that if two jointly Gaussian random processes $X(t)$ and $Y(t)$ are uncorrelated, i.e.,
\begin{align*}%\label{}
C_{XY}(t_1,t_2)=0, \quad \textrm{for all }t_1,t_2,
\end{align*}
then $X(t)$ and $Y(t)$ are two independent random processes.
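The reason is that a zero cross-covariance makes the joint covariance matrix of any stacked sample vector block diagonal, so the joint PDF factorizes into the PDF of the $X$ samples times the PDF of the $Y$ samples. The following sketch verifies this factorization numerically for a made-up pair of covariance matrices:

```python
import numpy as np
from scipy.stats import multivariate_normal

# X sampled at two times, Y at two times; zero cross-covariance block.
# The covariance matrices are arbitrary choices for the demonstration.
CX = np.array([[1.0, 0.4], [0.4, 1.0]])
CY = np.array([[2.0, 0.3], [0.3, 2.0]])
C = np.block([[CX, np.zeros((2, 2))],
              [np.zeros((2, 2)), CY]])

x = np.array([0.1, -0.7])
y = np.array([1.3, 0.2])
z = np.concatenate([x, y])

# Block-diagonal covariance => the joint PDF factorizes,
# which is exactly independence for jointly Gaussian vectors.
joint = multivariate_normal(np.zeros(4), C).pdf(z)
product = (multivariate_normal(np.zeros(2), CX).pdf(x) *
           multivariate_normal(np.zeros(2), CY).pdf(y))
print(np.isclose(joint, product))  # True
```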