Expected Values

Definition

The expected value of a random variable is the weighted average of all its possible values, where each value is weighted by the probability that it occurs.

Expected Value of a Discrete Random Variable

Definition

The expected value of a discrete random variable $X$ over a set $S$ is \[\mathrm{E}[X] = \sum\limits_{x \in S} x \mathrm{P}[X=x]\]

where $\mathrm{P}[X=x]$ is the probability that $X=x$.
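As a quick illustration of this formula, suppose $X$ is the score from a single roll of a fair six-sided die, so that each value in $S = \{1,2,3,4,5,6\}$ has probability $\frac{1}{6}$. Then \[\mathrm{E}[X] = \sum\limits_{x=1}^{6} x \cdot \frac{1}{6} = \frac{1+2+3+4+5+6}{6} = \frac{7}{2}\text{.}\]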

Worked Example 1


What is the expected value for the number of heads obtained from $3$ coin tosses?

Solution

Let $X$ be the random variable denoting the number of heads that appear.

\[\begin{array}{c|c|c}
\mathbf{x} & \mathbf{P[X=x]} & \mathbf{x \cdot P[X=x]} \\
\hline
0 & \dfrac{1}{8} & 0 \\[6pt]
1 & \dfrac{3}{8} & \dfrac{3}{8} \\[6pt]
2 & \dfrac{3}{8} & \dfrac{6}{8} \\[6pt]
3 & \dfrac{1}{8} & \dfrac{3}{8}
\end{array}\]

Note: $\cdot$ means multiply.

So if a student tosses a coin $3$ times, the probability that a head appears $0$ times is $\frac{1}{8}$, since only one of the $8$ equally likely outcomes (TTT) contains no heads. Similarly, the probability of exactly $1$ head is $\frac{3}{8}$, because $3$ of the $8$ outcomes (HTT, THT, TTH) contain exactly one head.

Now, as the expected value is the weighted average of all possible values, we need to sum the right-hand column. In this case the expected value is $0 + \frac{3}{8} + \frac{6}{8} + \frac{3}{8} = \frac{3}{2}$.

So if you toss a coin $3$ times, you expect to get heads $1.5$ times, which makes sense since the probability of getting heads on each toss is $0.5$. Note that we cannot actually get heads $1.5$ times, so the expected value does not tell us how many heads we will get on any particular run of $3$ tosses. Rather, if we ran this experiment a large number of times and took the mean number of heads obtained, we would expect it to be around $1.5$. This approach can be applied to many other examples in exactly the same way, which is what the general definition above expresses.
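This value can also be found without a table: each toss contributes an expected $\frac{1}{2}$ of a head, so by the additivity property of expected values given later on this page, \[\mathrm{E}[X] = \frac{1}{2} + \frac{1}{2} + \frac{1}{2} = \frac{3}{2}\text{.}\]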

Expected Value of a Continuous Random Variable

Definition

The expected value of a continuous random variable $X$ with probability density function $f(x)$ is \[\mathrm{E}[X] = \displaystyle\int\limits_{- \infty}^{\infty} x \cdot f(x) \mathrm{d} x.\] Note: $x\cdot f(x) $ must be integrable.
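As a quick illustration of this formula, suppose $X$ has probability density function $f(x) = 2x$ for $0 \leq x \leq 1$ and $f(x) = 0$ otherwise. Then \[\mathrm{E}[X] = \int\limits_{0}^{1} x \cdot 2x \, \mathrm{d} x = \left[ \frac{2x^3}{3} \right]_0^1 = \frac{2}{3}\text{.}\]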

Worked Example 2


This time, suppose the student rolls a fair die three times and wants to know how many sixes they should expect to roll. (This is again a discrete random variable.)

Solution

With $3$ rolls of the die we can roll either $0$, $1$, $2$ or $3$ sixes, so our set is $S=\{ 0,1,2,3 \}$.

The probability of rolling $0$ sixes is the probability of rolling something other than a six on every one of the three rolls, and the probability of not rolling a six on a single roll is $\frac{5}{6}$. \[\mathrm{P}[X=0] = \frac{5}{6} \times \frac{5}{6} \times \frac{5}{6} = \frac{5^3}{6^3} = \frac{125}{216}\text{.}\]

This time we want the probability of rolling exactly $1$ six, so one of the probabilities changes from $\frac{5}{6}$ to $\frac{1}{6}$. The six could be rolled on the first, second or third throw, so there are $3$ ways of rolling exactly one six and we must multiply this expression by $3$. \[\mathrm{P}[X=1] = \frac{5}{6} \times \frac{5}{6} \times \frac{1}{6} \times 3 = \frac{25}{72} \text{.}\]

To roll $2$ sixes we replace another $\frac{5}{6}$ with a $\frac{1}{6}$. The single non-six can occur on the first, second or third throw, so again we multiply this expression by $3$. \[\mathrm{P}[X=2] = \frac{5}{6} \times \frac{1}{6} \times \frac{1}{6} \times 3 = \frac{5}{72} \text{.}\]

Finally, to roll $3$ sixes every throw must be a six. \[\mathrm{P}[X=3] = \frac{1}{6} \times \frac{1}{6} \times \frac{1}{6} = \frac{1}{216} \text{.}\]

Now substitute these probabilities into the formula for the expected value of a discrete random variable:

\begin{align} \mathrm{E}[X] &= \sum\limits_{x\in S} x \cdot \mathrm{P}[X=x] \\ &= \left(0 \times \mathrm{P}[X=0]\right) + \left(1 \times \mathrm{P}[X=1]\right) + \left(2 \times \mathrm{P}[X=2]\right) + \left(3 \times \mathrm{P}[X=3]\right) \\ &= \left( 0 \times \frac{125}{216}\right) + \left(1 \times \frac{25}{72}\right) + \left(2 \times \frac{5}{72}\right) + \left(3 \times \frac{1}{216}\right) = \frac{1}{2}. \end{align}
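Notice that, as with the coin tosses, this is the number of rolls multiplied by the probability of a six on each roll: \[\mathrm{E}[X] = 3 \times \frac{1}{6} = \frac{1}{2}\text{.}\]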

Worked Example 3


The continuous random variable $X$ is uniformly distributed if it has probability density function \[f(x) = \begin{cases} \dfrac{1}{b-a} & \text{if }a < x \leq b, \\[6pt] 0 & \text {otherwise }.\end{cases}\]

What is the expected value of a uniformly distributed random variable $X$?
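Using the definition for a continuous random variable above, a sketch of the calculation is \begin{align} \mathrm{E}[X] &= \int\limits_{-\infty}^{\infty} x \cdot f(x) \mathrm{d} x = \int\limits_{a}^{b} \frac{x}{b-a} \mathrm{d} x \\ &= \left[ \frac{x^2}{2(b-a)} \right]_a^b = \frac{b^2 - a^2}{2(b-a)} = \frac{a+b}{2}, \end{align} which is the midpoint of the interval $[a,b]$.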

Properties of the Expected Value

Let $X$ and $Y$ be random variables with finite expected values and $c$ be a constant. Then

\begin{align} \mathrm{E} [X+c] &= \mathrm{E} [X] + c \\ \mathrm{E}[X+Y] &= \mathrm{E}[X] + \mathrm{E}[Y] \\ \mathrm{E}[cX] &= c \mathrm{E}[X]. \end{align}
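For example, taking $X$ to be the number of heads in $3$ coin tosses from Worked Example 1, where $\mathrm{E}[X] = \frac{3}{2}$, these properties give \[\mathrm{E}[2X + 3] = 2\mathrm{E}[X] + 3 = 2 \times \frac{3}{2} + 3 = 6\text{.}\]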

Let $g$ be a function of a continuous random variable $X$ with probability density function $f(x)$. Then \[\mathrm{E} [g(X)] = \int\limits_{-\infty}^{\infty}g(x)f(x) \mathrm{d} x.\]

Note: $g(x)f(x)$ must be integrable.
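For instance, if $X$ is uniformly distributed on $[0,1]$, so that $f(x) = 1$ for $0 < x \leq 1$ and $0$ otherwise, and $g(x) = x^2$, then \[\mathrm{E}[X^2] = \int\limits_{0}^{1} x^2 \, \mathrm{d} x = \frac{1}{3}\text{.}\]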

Video Examples
Example 1

This video, produced by Alissa Grant-Walker, works through an example on expected values.

Example 2

Daniel Organisciak works through the expected value of placing a bet on roulette.