Determine conditional and marginal probability functions for discrete random variables only

In the previous reading, we looked at joint discrete distribution functions. In this reading, we will determine conditional and marginal probability functions from joint discrete probability functions.

Suppose that we know the joint probability distribution of two discrete random variables, \(X\) and \(Y\), and that we wish to obtain the individual probability mass function for \(X\) or \(Y\). These individual mass functions of \(X\) and \(Y\) are referred to as the marginal probability distribution of \(X\) and \(Y\), respectively.

Once we have these marginal distributions, we can analyze the two variables separately.

Recall that the joint probability mass function of the discrete random variables \(X\) and \(Y\), denoted as \(f(x,y)\), satisfies the following properties:

  1. \(f\left(x,y\right)\geq 0\)
  2. \(\sum_{(x,y)\in S} f\left(x,y\right)=1\)

Definition

Let \(X\) and \(Y\) have the joint probability mass function \(f(x,y)\) defined on the space \(S\).

The probability mass function of \(X\) alone, which is called the marginal probability mass function of \(X\), is given by:

$$ f_X\left(x\right)=\sum_{y} f\left(x,y\right),\ \ \ x\in S_X $$

where the summation is taken over all possible values of \(y\) for each given \(x\) in the space \(S_X\).

Similarly, the marginal probability mass function of \(Y\) is given by:

$$ f_Y\left(y\right)=\sum_{x} f\left(x,y\right),\ \ \ y\in S_Y $$

where the summation is taken over all possible values of \(x\) for each given \(y\) in the space \(S_Y\).
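As a quick illustration of these definitions, here is a minimal Python sketch that computes both marginals by summing the joint p.m.f. over the other variable. The joint probabilities below are hypothetical, chosen only so that they sum to 1:

```python
# Marginal pmfs as sums of a joint pmf over the other variable.
# The joint pmf values here are hypothetical (illustration only).
joint = {(1, 1): 0.1, (1, 2): 0.3,
         (2, 1): 0.2, (2, 2): 0.4}

f_X, f_Y = {}, {}
for (x, y), p in joint.items():
    f_X[x] = f_X.get(x, 0.0) + p   # sum over y for each fixed x
    f_Y[y] = f_Y.get(y, 0.0) + p   # sum over x for each fixed y

# Each marginal is itself a pmf: nonnegative values that sum to 1.
```

Once computed this way, `f_X` and `f_Y` can be analyzed as ordinary one-variable distributions, exactly as the text describes.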

Example 1: Marginal Probability Distribution

The data for an insurance company’s policyholders is characterized as follows:

$$ \begin{array}{c|ccc|c}
 & \multicolumn{3}{c|}{\textbf{Loss Amount } (X)} & \\
\textbf{Risk Level } (R) & 0 & 50 & 100 & \textbf{Total} \\ \hline
0\ (\text{Low}) & 0.18 & 0.09 & 0.03 & 0.30 \\ \hline
1\ (\text{Medium}) & 0.12 & 0.30 & 0.18 & 0.60 \\ \hline
2\ (\text{High}) & 0.02 & 0.03 & 0.05 & 0.10 \\ \hline
\textbf{Total} & 0.32 & 0.42 & 0.26 & 1.00
\end{array} $$

Given the joint probability distribution above, we may be interested in the probabilities of a low-risk, a medium-risk, or a high-risk individual, or in the probabilities of a loss amount of 0, 50, or 100.

To find the marginal probability of \(R\), i.e., the probability of a low-risk, medium-risk, or high-risk individual, we add across each row, e.g.,

$$ P\left(R=0\right)=P\left(R=0\cap X=0\right)+P\left(R=0\cap X=50\right)+P\left(R=0\cap X=100\right) $$

We can do the same to find the probabilities of the different loss amounts, i.e., the marginal probability of \(X\), by adding down each column, e.g.,

$$ P\left(X=0\right)=P\left(R=0\cap X=0\right)+P\left(R=1\cap X=0\right)+P\left(R=2\cap X=0\right) $$

The marginal distributions of \(R\) and \(X\) are as follows:

$$ \begin{array}{c|c}
R & f_R(r) \\ \hline
0 & 0.30 \\
1 & 0.60 \\
2 & 0.10
\end{array}
\qquad
\begin{array}{c|c}
X & f_X(x) \\ \hline
0 & 0.32 \\
50 & 0.42 \\
100 & 0.26
\end{array} $$
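The row and column sums above are easy to check with a short Python sketch (variable names are mine; the probabilities are the ones in the table):

```python
# Marginals of the insurance example: sum across each row for f_R,
# down each column for f_X.
rows = {0: [0.18, 0.09, 0.03],   # R = 0 (low risk)
        1: [0.12, 0.30, 0.18],   # R = 1 (medium risk)
        2: [0.02, 0.03, 0.05]}   # R = 2 (high risk)
losses = [0, 50, 100]            # possible values of X

f_R = {r: sum(vals) for r, vals in rows.items()}
f_X = {x: sum(rows[r][i] for r in rows) for i, x in enumerate(losses)}

# f_R is approximately {0: 0.30, 1: 0.60, 2: 0.10}
# f_X is approximately {0: 0.32, 50: 0.42, 100: 0.26}
```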

Example 2: Marginal Probability Mass Function

Suppose that the joint p.m.f of \(X\) and \(Y\) is given as:

$$ f\left(x,y\right)=\frac{x+y}{21},\ \ \ x=1, 2;\ \ \ y=1, 2, 3 $$

  1. Determine the marginal probability mass function of \(X\).
  2. Determine the marginal probability mass function of \(Y\).

Solution

Summing over \(y\) gives the marginal probability mass function of \(X\):

$$ f_X\left(x\right)=\sum_{y=1}^{3}\frac{x+y}{21}=\frac{3x+6}{21},\ \ \ x=1,2 $$

Similarly, summing over \(x\) gives the marginal probability mass function of \(Y\):

$$ f_Y\left(y\right)=\sum_{x=1}^{2}\frac{x+y}{21}=\frac{3+2y}{21},\ \ \ y=1,2,3 $$
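The marginals can be checked with exact fractions. This sketch assumes the joint p.m.f. is \(f(x,y)=(x+y)/21\) over \(x=1,2\) and \(y=1,2,3\) (with 21 as the normalizing constant):

```python
from fractions import Fraction

# Exact marginals for f(x, y) = (x + y)/21, x in {1, 2}, y in {1, 2, 3}.
# This form of the pmf is an assumption; 21 normalizes it to sum to 1.
def f(x, y):
    return Fraction(x + y, 21)

xs, ys = (1, 2), (1, 2, 3)
f_X = {x: sum(f(x, y) for y in ys) for x in xs}   # f_X(x) = (3x + 6)/21
f_Y = {y: sum(f(x, y) for x in xs) for y in ys}   # f_Y(y) = (3 + 2y)/21

assert sum(f_X.values()) == 1 and sum(f_Y.values()) == 1
```

Using `Fraction` avoids floating-point rounding, so the sums-to-one check is exact.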

Example 3: Marginal Probability Mass Function

\(X\) and \(Y\) have the joint probability function as given in the table below:

Determine the probability distribution of \(X\).

Solution

Adding the joint probabilities across each row of the table, the marginal probabilities are:

$$ \begin{align*}
P\left(X=1\right) & =0.12+0.13+0.18+0.17=0.6 \\
P\left(X=2\right) & =0.15+0.05+0.06+0.04=0.3 \\
P\left(X=3\right) & =0.03+0.07=0.1
\end{align*} $$

Therefore, the marginal probability distribution of \(X\) is:

$$ \begin{array}{c|ccc}
X & 1 & 2 & 3 \\ \hline
P(X=x) & 0.6 & 0.3 & 0.1
\end{array} $$

Conditional Discrete Distributions

Conditional probability is a key part of Bayes' theorem. In plain language, it is the probability of one event occurring given that another event has occurred. It differs from joint probability, which is the probability that both events occur, without the knowledge that one of them has occurred. For instance:

We can use an Euler diagram to illustrate the difference between conditional and joint probabilities. Note that, in the diagram, each large square has an area of 1.

Let \(X\) be the event that a patient's left kidney is infected, and let \(Y\) be the event that the right kidney is infected. The green area on the left side of the diagram represents the probability that both of the patient's kidneys are infected; this is the joint probability \(P(X \cap Y)\). If \(Y\) is known to have occurred (e.g., given that the right kidney is infected), then everything outside \(Y\) is dropped, and everything in \(Y\) is rescaled to the size of the original space. The rescaled green area on the right-hand side is now the conditional probability of \(X\) given \(Y\), written \(P(X \mid Y)\). Put differently, this is the probability that the left kidney is infected given that we know the right kidney is infected.

It is worth noting that the conditional probability of \(X\) given \(Y\) is not necessarily equal to the conditional probability of \(Y\) given \(X\).

Recall that for the univariate case, the conditional probability of an event \(A\) given \(B\) is defined by:

$$ P\left(A\middle|B\right)=\frac{P\left(A\cap B\right)}{P\left(B\right)},\ \ \ P\left(B\right)>0 $$

We can extend this idea to multivariate distributions.

Definition

Let \(X\) and \(Y\) be discrete random variables with joint probability mass function, \(f(x, y)\) defined on the space \(S\).

Also, let \(f_X\left(x\right)\) and \(f_Y\left(y\right)\) be the marginal probability mass functions of \(X\) and \(Y\), respectively.

The conditional probability mass function of \(X\), given that \(Y=y\), is given by:

$$ g\left(x\middle|y\right)=\frac{f\left(x,y\right)}{f_Y\left(y\right)},\ \ \ f_Y\left(y\right)>0 $$

Similarly, the conditional probability mass function of \(Y\), given that \(X=x\), is defined by:

$$ h\left(y\middle|x\right)=\frac{f\left(x,y\right)}{f_X\left(x\right)},\ \ \ f_X\left(x\right)>0 $$

Example 1: Conditional Discrete Distribution

The data for an insurance company’s policyholders is characterized as follows:

$$ \begin{array}{c|ccc}
 & \multicolumn{3}{c}{\textbf{Loss Amount } (X)} \\
\textbf{Risk Level } (R) & 0 & 50 & 100 \\ \hline
0\ (\text{Low}) & 0.18 & 0.09 & 0.03 \\ \hline
1\ (\text{Medium}) & 0.12 & 0.30 & 0.18 \\ \hline
2\ (\text{High}) & 0.02 & 0.03 & 0.05
\end{array} $$

We can find the conditional probabilities for \(X \mid R=0\). Since \(f_R\left(0\right)=0.18+0.09+0.03=0.30\),

$$ \begin{align*}
P\left(X=0\middle|R=0\right) &= \frac{0.18}{0.30}=0.6 \\
P\left(X=50\middle|R=0\right) &= \frac{0.09}{0.30}=0.3 \\
P\left(X=100\middle|R=0\right) &= \frac{0.03}{0.30}=0.1
\end{align*} $$

Now, the conditional distribution of \(X \mid R=0\) is:

$$ \begin{array}{c|ccc}
X & 0 & 50 & 100 \\ \hline
P(X=x \mid R=0) & 0.6 & 0.3 & 0.1
\end{array} $$
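In Python, this conditional distribution can be computed by dividing the \(R=0\) row of the joint table by its sum (a sketch; variable names are mine):

```python
# Conditional pmf of X given R = 0: divide the R = 0 row of the joint
# pmf by the marginal f_R(0).
row_R0 = {0: 0.18, 50: 0.09, 100: 0.03}   # joint probabilities f(R=0, x)
f_R0 = sum(row_R0.values())               # marginal f_R(0) = 0.30

cond_X_given_R0 = {x: p / f_R0 for x, p in row_R0.items()}
# approximately {0: 0.6, 50: 0.3, 100: 0.1}; the values sum to 1
```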

Example 2: Conditional Discrete Probability Function

Let \(X\) be the number of days of sickness over the last year, and let \(Y\) be the number of days of sickness this year. \(X\) and \(Y\) are jointly distributed as in the table below:

$$ \begin{array}{c|ccc}
f(x,y) & Y=1 & Y=2 & Y=3 \\ \hline
X=0 & 0.1 & 0.1 & 0.2 \\ \hline
X=1 & 0.1 & 0.1 & 0.1 \\ \hline
X=2 & 0 & 0.2 & 0.1
\end{array} $$

Determine:

  1. the marginal distribution of \(X\);
  2. the marginal distribution of \(Y\); and
  3. the conditional distribution of \(Y\) given \(X=2\).

Solution

  1. The marginal probability mass function of \(X\) is given by: $$ P\left(X=x\right)=\sum_{y} f\left(x,y\right) $$ Therefore, we have: $$ \begin{align*} P\left(X=0\right) & =0.1+0.1+0.2=0.4 \\ P\left(X=1\right) &=0.1+0.1+0.1=0.3 \\ P\left(X=2\right)& =0+0.2+0.1=0.3 \end{align*} $$ When represented in a table, the marginal distribution of \(X\) is: $$ \begin{array}{c|ccc}
    X & 0 & 1 & 2 \\ \hline
    P(X=x) & 0.4 & 0.3 & 0.3
    \end{array} $$
  2. Similarly, $$ P\left(Y=y\right)=\sum_{x} f\left(x,y\right) $$ Therefore, we have: $$ \begin{align*} P\left(Y=1\right) &=0.1+0.1+0=0.2 \\ P\left(Y=2\right) & =0.1+0.1+0.2=0.4 \\ P\left(Y=3\right)&=0.2+0.1+0.1=0.4 \end{align*} $$ Therefore, the marginal distribution of \(Y\) is: $$ \begin{array}{c|ccc}
    Y & 1 & 2 & 3 \\ \hline
    P(Y=y) & 0.2 & 0.4 & 0.4
    \end{array} $$
  3. Using the definition of conditional probability: $$ \begin{align*}
    P\left(Y=1\middle|X=2\right) & =\frac{P\left(X=2, Y=1\right)}{P\left(X=2\right)}=\frac{0}{0.3}=0 \\
    P\left(Y=2\middle|X=2\right) & =\frac{0.2}{0.3}\approx 0.67 \\
    P\left(Y=3\middle|X=2\right) & =\frac{0.1}{0.3}\approx 0.33
    \end{align*} $$ The conditional distribution is, therefore, $$ p\left(y\middle|x=2\right)= \begin{cases} 0, & y =1 \\ 0.67, & y=2 \\ 0.33, & y=3 \end{cases} $$
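Part 3 can be verified with exact fractions; the joint table below is encoded from the sums used in the solution:

```python
from fractions import Fraction

# Joint pmf of (X, Y) for the sickness-days example, encoded row by row.
joint = {
    (0, 1): Fraction(1, 10), (0, 2): Fraction(1, 10), (0, 3): Fraction(1, 5),
    (1, 1): Fraction(1, 10), (1, 2): Fraction(1, 10), (1, 3): Fraction(1, 10),
    (2, 1): Fraction(0),     (2, 2): Fraction(1, 5),  (2, 3): Fraction(1, 10),
}

# Marginal P(X = 2), then the conditional pmf of Y given X = 2.
p_X2 = sum(p for (x, _), p in joint.items() if x == 2)
cond = {y: joint[(2, y)] / p_X2 for y in (1, 2, 3)}
# cond is {1: 0, 2: 2/3, 3: 1/3}, matching the 0, 0.67, 0.33 above
```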

Example 3: Conditional Discrete Probability Function

An actuary determines that the number of monthly accidents in two towns, \(M\) and \(N\), is jointly distributed as

Let \(X\) and \(Y\) be the number of monthly accidents in towns \(M\) and \(N\), respectively. Determine:

  1. \(h\left(y\middle|x\right)\)
  2. \(g(x|y)\)

Solution

We can use conditional distributions to find conditional probabilities.

If we compute all the probabilities for this conditional probability function, we would see that they satisfy the same properties as the joint probability mass functions seen in the previous reading: they are nonnegative and sum to 1.

Now, let's keep \(X=x\) fixed and check this. Summing the conditional probabilities over all values of \(y\), we get:

$$ \sum_{y} h\left(y\middle|x\right)=\sum_{y}\frac{f\left(x,y\right)}{f_X\left(x\right)}=\frac{f_X\left(x\right)}{f_X\left(x\right)}=1 $$

Note that the same occurs for the conditional distribution of \(X\) given \(Y\), \(g\left(x\middle|y\right)\).

Thus, \(h\left(y\middle|x\right)\) and \(g\left(x\middle|y\right)\) both satisfy the conditions of a probability mass function, and we can perform the same operations on them that we performed on joint discrete probability functions.
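This sum-to-one property is easy to check programmatically; here is a sketch over the insurance joint p.m.f. from Example 1, conditioning on each value of \(R\) in turn:

```python
# For every value r of R, the conditional pmf of X given R = r must
# sum to 1: sum over x of f(r, x) / f_R(r) = f_R(r) / f_R(r) = 1.
joint = {
    (0, 0): 0.18, (0, 50): 0.09, (0, 100): 0.03,
    (1, 0): 0.12, (1, 50): 0.30, (1, 100): 0.18,
    (2, 0): 0.02, (2, 50): 0.03, (2, 100): 0.05,
}

for r in (0, 1, 2):
    f_r = sum(p for (ri, _), p in joint.items() if ri == r)   # marginal f_R(r)
    cond = {x: joint[(r, x)] / f_r for x in (0, 50, 100)}     # P(X = x | R = r)
    assert abs(sum(cond.values()) - 1.0) < 1e-12              # pmf check
```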


Learning Outcome

Topic 3. b: Multivariate Random Variables – Determine conditional and marginal probability functions for discrete random variables only.