5.1.1 Joint Probability Mass Function (PMF)

Remember that for a discrete random variable $X$, we define the PMF as $P_X(x)=P(X=x)$. Now, if we have two random variables $X$ and $Y$, and we would like to study them jointly, we define the joint probability mass function as follows:
The joint probability mass function of two discrete random variables $X$ and $Y$ is defined as \begin{align}%\label{} \nonumber P_{XY}(x,y)=P(X=x, Y=y). \end{align}
Note that as usual, the comma means "and," so we can write \begin{align}%\label{} \nonumber P_{XY}(x,y)&=P(X=x, Y=y) \\ \nonumber &= P\big((X=x)\textrm{ and }(Y=y)\big). \end{align} We can define the joint range for $X$ and $Y$ as \begin{align}%\label{} \nonumber R_{XY}=\{(x,y) | P_{XY}(x,y)>0\}. \end{align} In particular, if $R_X=\{x_1,x_2,... \}$ and $R_Y=\{y_1,y_2,...\}$, then we can always write \begin{align}%\label{} \nonumber R_{XY} & \subset R_X \times R_Y \\ \nonumber &= \{(x_i,y_j) | x_i \in R_X, y_j \in R_Y \}. \end{align} In fact, sometimes we define $R_{XY}=R_X \times R_Y$ to simplify the analysis. In this case, for some pairs $(x_i,y_j)$ in $R_X \times R_Y$, $P_{XY}(x_i,y_j)$ might be zero. For two discrete random variables $X$ and $Y$, we have
\begin{align}%\label{} \nonumber \sum_{(x_i,y_j) \in R_{XY}} P_{XY}(x_i,y_j)=1. \end{align}
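As a concrete check, here is a minimal Python sketch that stores a joint PMF as a dictionary keyed by pairs $(x,y)$ and verifies that it sums to one. The values are those of Table 5.1 in Example 5.1 below; the name `P_XY` is just a convenient label for this illustration.

```python
from fractions import Fraction as F

# Joint PMF stored as a dictionary keyed by (x, y); the values are
# those of Table 5.1 in Example 5.1 below.
P_XY = {
    (0, 0): F(1, 6), (0, 1): F(1, 4), (0, 2): F(1, 8),
    (1, 0): F(1, 8), (1, 1): F(1, 6), (1, 2): F(1, 6),
}

# The joint PMF must sum to 1 over the joint range R_XY.
assert sum(P_XY.values()) == 1
```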
We can use the joint PMF to find $P\big( (X,Y) \in A \big)$ for any set $A \subset \mathbb{R}^2$. Specifically, we have
\begin{align}%\label{} \nonumber P\big( (X,Y) \in A \big)=\sum_{(x_i,y_j) \in (A \cap R_{XY})} P_{XY}(x_i,y_j) \end{align}
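Building on the sketch above, a hypothetical helper `prob_in` computes $P\big((X,Y) \in A\big)$ by summing the joint PMF over the pairs of $A$ that lie in $R_{XY}$:

```python
# P((X, Y) in A): sum the joint PMF over the pairs of A that lie in R_XY.
# `prob_in` is a hypothetical helper; `P_XY` is the dictionary defined above.
def prob_in(A, pmf):
    return sum(p for pair, p in pmf.items() if pair in A)

A = {(0, 0), (0, 1)}          # the event (X = 0 and Y <= 1)
print(prob_in(A, P_XY))       # 5/12
```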
Note that the event $X=x$ can be written as $\{(x_i,y_j): x_i=x, y_j \in R_Y \}$. Also, the event $Y=y$ can be written as $\{(x_i,y_j): x_i\in R_X, y_j=y\}$. Thus, we can write \begin{align}%\label{} \nonumber P_{XY}(x,y)&=P(X=x, Y=y) \\ \nonumber &=P\big((X=x)\cap(Y=y)\big). \end{align}

Marginal PMFs

The joint PMF contains all the information regarding the distributions of $X$ and $Y$. This means that, for example, we can obtain the PMF of $X$ from its joint PMF with $Y$. Indeed, we can write \begin{align}%\label{} \nonumber P_X(x) &= P(X=x)\\ \nonumber &=\sum_{y_j \in R_Y} P(X=x, Y=y_j) &\textrm{law of total probability}\\ \nonumber &=\sum_{y_j \in R_Y} P_{XY}(x,y_j). \end{align} Here, we call $P_X(x)$ the marginal PMF of $X$. Similarly, we can find the marginal PMF of $Y$ as \begin{align}%\label{} \nonumber P_Y(y)=\sum_{x_i \in R_X} P_{XY}(x_i,y). \end{align}

Marginal PMFs of $X$ and $Y$:

\begin{align}\label{Eq:marginals} \nonumber P_X(x)&=\sum_{y_j \in R_Y} P_{XY}(x,y_j), \hspace{20pt} \textrm{ for any } x \in R_X \\ P_Y(y)&=\sum_{x_i \in R_X} P_{XY}(x_i,y), \hspace{20pt} \textrm{ for any } y \in R_Y \hspace{40pt} (5.1) \end{align}
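Equation 5.1 translates directly into code. The following sketch adds a hypothetical `marginals` helper that sums the dictionary representation introduced above over the other variable:

```python
from collections import defaultdict

# Hypothetical helper implementing Equation 5.1: each marginal PMF is
# obtained by summing the joint PMF over the other variable.
def marginals(pmf):
    P_X, P_Y = defaultdict(int), defaultdict(int)
    for (x, y), p in pmf.items():
        P_X[x] += p    # P_X(x) = sum over y_j in R_Y of P_XY(x, y_j)
        P_Y[y] += p    # P_Y(y) = sum over x_i in R_X of P_XY(x_i, y)
    return dict(P_X), dict(P_Y)
```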

Let's practice these concepts by looking at an example.



Example 5.1
Consider two random variables $X$ and $Y$ with joint PMF given in Table 5.1.

Table 5.1 Joint PMF of $X$ and $Y$ in Example 5.1

\begin{array}{c|ccc}
 & Y = 0 & Y = 1 & Y = 2 \\ \hline
X = 0 & \frac{1}{6} & \frac{1}{4} & \frac{1}{8} \\
X = 1 & \frac{1}{8} & \frac{1}{6} & \frac{1}{6}
\end{array}

Figure 5.1 shows $P_{XY}(x,y)$.

Figure 5.1: Joint PMF of $X$ and $Y$ (Example 5.1).

  1. Find $P(X=0,Y\leq1)$.
  2. Find the marginal PMFs of $X$ and $Y$.
  3. Find $P(Y=1|X=0)$.
  4. Are $X$ and $Y$ independent?
  • Solution
      1. To find $P(X=0, Y \leq 1)$, we can write \begin{align}%\label{} \nonumber P(X=0, Y \leq 1) =P_{XY}(0,0)+ P_{XY}(0,1)=\frac{1}{6}+\frac{1}{4}=\frac{5}{12}. \end{align}
      2. Note that from the table, \begin{align}%\label{} \nonumber R_X=\{0,1\} \hspace{20pt}\textrm{ and }\hspace{20pt} R_Y=\{0,1,2\}. \end{align} Now we can use Equation 5.1 to find the marginal PMFs. For example, to find $P_X(0)$, we can write \begin{align}%\label{} \nonumber P_X(0)&=P_{XY}(0,0)+P_{XY}(0,1)+P_{XY}(0,2)\\ \nonumber &=\frac{1}{6}+\frac{1}{4}+\frac{1}{8}\\ \nonumber &=\frac{13}{24}. \end{align} We obtain \begin{equation} \nonumber P_X(x) = \left\{ \begin{array}{l l} \frac{13}{24} & \quad x=0 \\ & \quad \\ \frac{11}{24} & \quad x=1 \\ & \quad \\ 0 & \quad \text{otherwise} \end{array} \right. \end{equation} \begin{equation} \nonumber P_Y(y) = \left\{ \begin{array}{l l} \frac{7}{24} & \quad y=0 \\ & \quad \\ \frac{5}{12} & \quad y=1 \\ & \quad \\ \frac{7}{24} & \quad y=2 \\ & \quad \\ 0 & \quad \text{otherwise} \end{array} \right. \end{equation}
      3. Find $P(Y=1 | X=0)$: Using the formula for conditional probability, we have \begin{align}%\label{} \nonumber P(Y=1 | X=0)&=\frac{P(X=0, Y=1)}{P(X=0)}\\ \nonumber &=\frac{P_{XY}(0,1)}{P_X(0)}\\ \nonumber &=\frac{\frac{1}{4}}{\frac{13}{24}}=\frac{6}{13}. \end{align}
      4. Are $X$ and $Y$ independent? $X$ and $Y$ are not independent, because as we just found out \begin{align}%\label{} \nonumber P(Y=1|X=0)=\frac{6}{13} \neq P(Y=1)=\frac{5}{12}. \end{align} Caution: If we want to show that $X$ and $Y$ are independent, we need to check that $P(X=x_i,Y=y_j)=P(X=x_i)P(Y=y_j)$, for all $x_i \in R_X$ and all $y_j \in R_Y$. Thus, even if in the above calculation we had found $P(Y=1 | X=0)= P(Y=1)$, we would not yet have been able to conclude that $X$ and $Y$ are independent. For that, we would need to check the independence condition for all $x_i \in R_X$ and all $y_j \in R_Y$.
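The calculations in this example can be verified numerically. The sketch below reuses `P_XY` and the hypothetical `marginals` helper from the earlier snippets; exact fractions reproduce parts 2 through 4:

```python
P_X, P_Y = marginals(P_XY)

# Part 2: the marginal PMFs match the piecewise formulas above.
assert P_X == {0: F(13, 24), 1: F(11, 24)}
assert P_Y == {0: F(7, 24), 1: F(5, 12), 2: F(7, 24)}

# Part 3: P(Y = 1 | X = 0) = P_XY(0, 1) / P_X(0) = 6/13.
cond = P_XY[(0, 1)] / P_X[0]
assert cond == F(6, 13)

# Part 4: X and Y are dependent, since P(Y = 1 | X = 0) != P(Y = 1).
assert cond != P_Y[1]
```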

