Discrete Random Variables - StudyPulse

Discrete Random Variables

Mathematical Methods
05 Apr 2025

Specification of Probability Distributions

A discrete random variable is a variable that can take only a finite or countably infinite number of values. Examples include the number of heads when flipping a coin a fixed number of times, or the number of defective items in a batch.

The probability distribution of a discrete random variable specifies the probability of each possible value of the variable.

Methods of Specification:

  • Graphs: A bar graph can be used to represent the probability of each value. The x-axis represents the possible values of the random variable, and the y-axis represents the corresponding probabilities.
  • Tables: A table can list each possible value of the random variable and its associated probability. The sum of all probabilities must equal 1.
  • Probability Mass Function (PMF): A PMF, denoted by $p(x)$ or $Pr(X=x)$, gives the probability that the random variable $X$ takes on a specific value $x$. Mathematically, it must satisfy:
    • $0 \le p(x) \le 1$ for all $x$
    • $\sum_{x} p(x) = 1$

Example:

Consider a random variable $X$ representing the number of heads when flipping a fair coin twice. The possible values are 0, 1, and 2. The probability distribution can be specified as follows:

x (Number of Heads)    p(x)
0                      0.25
1                      0.50
2                      0.25

The PMF can be written as:

$p(x) = \begin{cases} 0.25, & x = 0 \\ 0.50, & x = 1 \\ 0.25, & x = 2 \\ 0, & \text{otherwise} \end{cases}$
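The table and PMF above can be sketched in Python as a dictionary mapping values to probabilities, with a quick check of the two PMF conditions (the dictionary name and helper function are illustrative, not part of any standard library):

```python
# PMF for X = number of heads in two flips of a fair coin.
pmf = {0: 0.25, 1: 0.50, 2: 0.25}

# The two conditions a PMF must satisfy:
assert all(0 <= p <= 1 for p in pmf.values())   # 0 <= p(x) <= 1
assert abs(sum(pmf.values()) - 1) < 1e-9        # probabilities sum to 1

def p(x):
    """Pr(X = x); returns 0 for values outside the support."""
    return pmf.get(x, 0)

print(p(1))   # 0.5  (probability of exactly one head)
print(p(5))   # 0    (5 is outside the support)
```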

Mean, Variance, and Standard Deviation

The mean (or expected value), denoted by $\mu$ or $E(X)$, represents the average value of the random variable. It is calculated as:

$$\mu = E(X) = \sum_{x} x \cdot p(x)$$

The variance, denoted by $\sigma^2$ or $Var(X)$, measures the spread of the distribution around the mean. It is calculated as:

$$\sigma^2 = Var(X) = E[(X - \mu)^2] = \sum_{x} (x - \mu)^2 \cdot p(x)$$

An alternative (computational) formula for variance is:

$$Var(X) = E(X^2) - [E(X)]^2 = \sum_{x} x^2 \cdot p(x) - \mu^2$$

The standard deviation, denoted by $\sigma$ or $SD(X)$, is the square root of the variance and provides a measure of the spread in the same units as the random variable.

$$\sigma = SD(X) = \sqrt{Var(X)}$$

Interpretation:

  • The mean represents the long-run average value of the random variable.
  • The variance and standard deviation indicate the variability or spread of the distribution. A larger variance/standard deviation indicates greater variability.

Example:

Using the previous example of flipping a fair coin twice:

$\mu = (0 \times 0.25) + (1 \times 0.50) + (2 \times 0.25) = 1$

$\sigma^2 = (0-1)^2 \times 0.25 + (1-1)^2 \times 0.50 + (2-1)^2 \times 0.25 = 0.5$

$\sigma = \sqrt{0.5} \approx 0.707$
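The calculations above can be reproduced directly from the PMF table. This is a minimal sketch, computing both the definition formula and the computational formula for the variance to confirm they agree:

```python
import math

# PMF for X = number of heads in two flips of a fair coin.
pmf = {0: 0.25, 1: 0.50, 2: 0.25}

mu = sum(x * p for x, p in pmf.items())                   # E(X)
var = sum((x - mu) ** 2 * p for x, p in pmf.items())      # definition formula
var_alt = sum(x**2 * p for x, p in pmf.items()) - mu**2   # computational formula
sd = math.sqrt(var)                                        # SD(X)

print(mu, var, sd)   # 1.0 0.5 0.7071...
```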

Bernoulli Trials and Binomial Distribution

A Bernoulli trial is a random experiment with only two possible outcomes: success (S) or failure (F). The probability of success is denoted by $p$, and the probability of failure is $1-p$.

The binomial distribution, denoted by $Bi(n, p)$, models the number of successes in $n$ independent Bernoulli trials, where each trial has a probability of success $p$.

The PMF of the binomial distribution is given by:

$$P(X = k) = {n \choose k} p^k (1-p)^{n-k}$$

where $k$ is the number of successes (0, 1, 2, …, n) and ${n \choose k} = \frac{n!}{k!(n-k)!}$ is the binomial coefficient.

For a binomial distribution $Bi(n, p)$:

  • Mean: $\mu = E(X) = np$
  • Variance: $\sigma^2 = Var(X) = np(1-p)$
  • Standard Deviation: $\sigma = \sqrt{np(1-p)}$

Example:

Suppose we flip a fair coin 5 times. Let $X$ be the number of heads. Then $X \sim Bi(5, 0.5)$.

$P(X = 2) = {5 \choose 2} (0.5)^2 (0.5)^3 = 10 \times 0.25 \times 0.125 = 0.3125$

$\mu = 5 \times 0.5 = 2.5$

$\sigma^2 = 5 \times 0.5 \times 0.5 = 1.25$

$\sigma = \sqrt{1.25} \approx 1.118$
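The $Bi(5, 0.5)$ calculations above can be sketched using `math.comb` for the binomial coefficient (the `binom_pmf` helper is illustrative, not a library function):

```python
import math

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Bi(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 5, 0.5                     # five flips of a fair coin

print(binom_pmf(2, n, p))         # P(X = 2) = 0.3125
print(n * p)                      # mean:     2.5
print(n * p * (1 - p))            # variance: 1.25
```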

Effect of Parameter Variation on PMF

For the binomial distribution $Bi(n, p)$:

  • Varying n (number of trials): Increasing $n$ while keeping $p$ constant shifts the mean proportionally (since $\mu = np$) and widens the distribution (since $\sigma^2 = np(1-p)$ also grows with $n$).
  • Varying p (probability of success):
    • If $p = 0.5$, the distribution is symmetric.
    • If $p < 0.5$, the distribution is skewed to the right (positive skew).
    • If $p > 0.5$, the distribution is skewed to the left (negative skew).

Histograms of the PMF for different values of $n$ and $p$ illustrate these effects: increasing $n$ spreads the probability over more values and moves the peak to the right, while moving $p$ away from 0.5 pushes the peak toward one end of the range and skews the distribution.
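The effect of $p$ on the shape can be checked numerically: for fixed $n$, the most likely value (the mode) moves with $p$. A small sketch, reusing an illustrative `binom_pmf` helper:

```python
import math

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Bi(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

n = 10
modes = {}
for p in (0.2, 0.5, 0.8):
    probs = [binom_pmf(k, n, p) for k in range(n + 1)]
    modes[p] = probs.index(max(probs))   # most likely value of X
    print(f"p = {p}: mode at k = {modes[p]}")

# For p = 0.2 the peak sits low (positive skew), for p = 0.5 it is
# central (symmetric), and for p = 0.8 it sits high (negative skew).
```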

Probability Calculations and Conditional Probability

Probabilities for specific values or intervals can be calculated using the PMF.

Calculating Probabilities:

  • $P(X = a)$: Probability that $X$ takes a specific value $a$. Use the PMF directly: $P(X = a) = p(a)$.
  • $P(a \le X \le b)$: Probability that $X$ lies between $a$ and $b$ (inclusive). Sum the probabilities for each value in the interval: $P(a \le X \le b) = \sum_{x=a}^{b} p(x)$.
  • $P(X > a)$: Probability that $X$ is greater than $a$. This can be calculated as $1 - P(X \le a)$ or by summing the probabilities for all values greater than $a$.
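The three kinds of calculation above can be sketched for the earlier $Bi(5, 0.5)$ example (the `binom_pmf` helper is illustrative):

```python
import math

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Bi(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 5, 0.5

p_eq_2   = binom_pmf(2, n, p)                                 # P(X = 2)
p_1_to_3 = sum(binom_pmf(k, n, p) for k in range(1, 4))       # P(1 <= X <= 3)
p_gt_3   = 1 - sum(binom_pmf(k, n, p) for k in range(0, 4))   # P(X > 3) via complement

print(p_eq_2)     # 0.3125
print(p_1_to_3)   # 25/32 = 0.78125
print(p_gt_3)     # 6/32  = 0.1875
```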

Conditional Probability:

The conditional probability of event A given event B is defined as:

$$P(A|B) = \frac{P(A \cap B)}{P(B)}$$

where $P(A \cap B)$ is the probability of both A and B occurring, and $P(B)$ is the probability of B occurring.

Example:

Consider rolling a fair six-sided die. Let $X$ be the number rolled. Find the probability that $X$ is even given that $X$ is greater than 2.

$A$: $X$ is even (i.e., X = 2, 4, 6)
$B$: $X$ is greater than 2 (i.e., X = 3, 4, 5, 6)

$P(A \cap B)$: $X$ is even and greater than 2 (i.e., X = 4, 6). $P(A \cap B) = 2/6 = 1/3$
$P(B) = 4/6 = 2/3$

$P(A|B) = \frac{1/3}{2/3} = \frac{1}{2}$
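The die example above can be verified by counting equally likely outcomes, using exact fractions to avoid rounding:

```python
from fractions import Fraction

outcomes = range(1, 7)                    # fair six-sided die
A = {x for x in outcomes if x % 2 == 0}   # event A: X is even -> {2, 4, 6}
B = {x for x in outcomes if x > 2}        # event B: X > 2     -> {3, 4, 5, 6}

p_B = Fraction(len(B), 6)                 # P(B) = 2/3
p_A_and_B = Fraction(len(A & B), 6)       # P(A and B) = 1/3, from {4, 6}
result = p_A_and_B / p_B                  # P(A|B)

print(result)   # 1/2
```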
