3.6 Random Variables
3.6.1 Definition
A random variable (r.v.) is a transformation (function) from \(\Omega\) into \(\mathbb{R}^n\), i.e., the results of random experiments are transformed into numbers. For a random variable \(X\), the set of all possible values of \(X\) is denoted \(R_{X}\) and called its codomain. It can be thought of as a “numerical sample space” obtained from \(\Omega\). A discrete random variable is one for which \(R_{X}\) is countable.
Example 3.22 (Discrete random variable) Consider the sample space defined in Example 3.13. Suppose you are interested in the random variable ‘sum of points’, denoted by \(S\). The set of possible values of \(S\) is \(R_{S} = \left\lbrace 2,3, \ldots, 12 \right\rbrace\) and \(|R_{S}|=11\). \(\\\)
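As a quick check, the codomain \(R_S\) can be enumerated directly. Below is a minimal sketch in Python (the language is an assumption, since the text shows no code), taking the experiment of Example 3.13 to be the roll of two six-sided dice, consistent with \(R_{S} = \left\lbrace 2, \ldots, 12 \right\rbrace\) above.

```python
# Minimal sketch: enumerate the codomain R_S of S = 'sum of points'
# for two six-sided dice (assumed to be the experiment of Example 3.13).
from itertools import product

omega = list(product(range(1, 7), repeat=2))   # sample space: 36 ordered pairs
R_S = sorted({a + b for a, b in omega})        # distinct values of S

print(R_S)        # [2, 3, ..., 12]
print(len(R_S))   # 11
```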
Exercise 3.12 Consider Example 3.13. Define the codomains of the following random variables:
a. \(D\): ‘difference of points’.
b. \(P\): ‘product of points’.
c. \(Q\): ‘quotient of points’.
3.6.2 Probability distribution
Let \(X\) be a discrete random variable. Its probability distribution (probability mass function) associates with each point \(x \in R_{X}\) a probability \(p(x) = Pr(X=x)\), satisfying \(p(x) \ge 0\) for all \(x\) and \(\sum_{x \in R_{X}} p(x) = 1\).
Example 3.23 (Probability with discrete r.v.) Consider two consecutive tosses of a balanced coin. The sample space is \(\Omega = \left\lbrace HH,HT,TH,TT \right\rbrace\), where \(H\) represents ‘heads’ and \(T\) ‘tails’. If we are interested in the random variable \(X\): ‘number of heads’, the set of interest becomes \(R_{X} = \left\lbrace 0,1,2 \right\rbrace\), where the element \(0\) of \(R_X\) corresponds to the event \(\left\lbrace TT \right\rbrace\), \(1\) to the event \(\left\lbrace TH,HT \right\rbrace\) and \(2\) to \(\left\lbrace HH \right\rbrace\). The probabilities are \[ p(0) = Pr(X=0)=Pr(\left\lbrace TT \right\rbrace) = \dfrac{1}{2} \times \dfrac{1}{2} =\dfrac{1}{4}, \] \[ p(1) = Pr(X=1)=Pr(\left\lbrace TH,HT \right\rbrace)= \left( \dfrac{1}{2} \times \dfrac{1}{2} \right) + \left( \dfrac{1}{2} \times \dfrac{1}{2} \right) = \dfrac{2}{4} = \dfrac{1}{2}, \] \[ p(2) = Pr(X=2)=Pr(\left\lbrace HH \right\rbrace)=\dfrac{1}{2} \times \dfrac{1}{2} =\dfrac{1}{4}. \] Note that \(Pr(X=0)+Pr(X=1)+Pr(X=2)=\dfrac{1}{4}+\dfrac{2}{4}+\dfrac{1}{4}=1\). \(\\\)
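The same bookkeeping can be done by enumeration. The sketch below (Python, assumed as before) builds the probability mass function of \(X\) by counting the equally likely outcomes of \(\Omega\); the variable names are illustrative only.

```python
# Minimal sketch: distribution of X = 'number of heads' in two tosses
# of a balanced coin, by enumerating the equally likely outcomes of Omega.
from itertools import product
from fractions import Fraction

omega = list(product("HT", repeat=2))   # {HH, HT, TH, TT}
p = Fraction(1, len(omega))             # each outcome has probability 1/4

pmf = {}
for outcome in omega:
    x = outcome.count("H")              # value of X for this outcome
    pmf[x] = pmf.get(x, 0) + p

for x in sorted(pmf):
    print(x, pmf[x])                    # 0 1/4, 1 1/2, 2 1/4
print(sum(pmf.values()))                # 1, as required
```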
Exercise 3.13 Obtain the probability distributions:
a. From Example 3.22.
b. From item a. of Exercise 3.12.
c. From item b. of Exercise 3.12.
d. From item c. of Exercise 3.12.
\(\\\)
Exercise 3.14 Consider Example 3.23.
a. Redo it for three tosses.
b. Redo it for four tosses.
c. Redo it for \(n\) tosses.
3.6.3 Expected value
The expected value of a discrete random variable \(X\) is given by
\[\begin{equation} E\left[ X \right] = \sum_{x} x \cdot p(x) \tag{3.44} \end{equation}\]
The expected value of a function \(g(X)\) of \(X\) is given by
\[\begin{equation} E\left[ g(X) \right] = \sum_{x} g(x) \cdot p(x) \tag{3.45} \end{equation}\]
Example 3.24 (Expected value of discrete r.v. \(X\) and \(X^2\)) From Example 3.23 one can calculate \[ E(X) = 0 \times \dfrac{1}{4} + 1 \times \dfrac{2}{4} + 2 \times \dfrac{1}{4} = 1. \] This result was expected given the symmetry of the distribution. The expectation of \(g(X) = X^2\) is given by \[ E(X^2) = 0^2 \times \dfrac{1}{4} + 1^2 \times \dfrac{2}{4} + 2^2 \times \dfrac{1}{4} = \dfrac{3}{2} = 1.5. \]
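A minimal sketch of Equations (3.44) and (3.45) applied to this example, again in Python (assumed) and reusing the probability mass function obtained in Example 3.23, is given below.

```python
# Minimal sketch: E[X] and E[X^2] via Equations (3.44) and (3.45),
# with p(0) = 1/4, p(1) = 1/2, p(2) = 1/4 from Example 3.23.
from fractions import Fraction

pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

E_X  = sum(x * px for x, px in pmf.items())      # E[X]   = sum of x * p(x)
E_X2 = sum(x**2 * px for x, px in pmf.items())   # E[X^2] = sum of x^2 * p(x)

print(E_X)    # 1
print(E_X2)   # 3/2
```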
3.6.4 Variance and standard deviation
The variance of a discrete random variable \(X\) is given by
\[\begin{equation} V(X) = E( \left[ X - E(X) \right] ^2) = E(X^2) - \left[ E(X) \right] ^2 \tag{3.46} \end{equation}\]
The standard deviation of a discrete random variable \(X\) is given by
\[\begin{equation} D(X) = \sqrt{V(X)} \tag{3.47} \end{equation}\]
Example 3.25 (Variance and standard deviation of a discrete r.v.) From Example 3.24 one can calculate \[ V(X) = \dfrac{3}{2} - 1^2 = \dfrac{1}{2} = 0.5 \] and \[ D(X) = \sqrt{0.5} \approx 0.7071. \]
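The sketch below reproduces Example 3.25 with Equations (3.46) and (3.47), once more in Python under the same assumptions, reusing \(E(X) = 1\) and \(E(X^2) = 3/2\) from Example 3.24.

```python
# Minimal sketch: variance and standard deviation of X,
# reusing E(X) = 1 and E(X^2) = 3/2 from Example 3.24.
from math import sqrt

E_X, E_X2 = 1, 3 / 2

V_X = E_X2 - E_X**2   # V(X) = E(X^2) - [E(X)]^2
D_X = sqrt(V_X)       # D(X) = sqrt(V(X))

print(V_X)   # 0.5
print(D_X)   # ~0.7071
```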