Mathematical expectation

In words...

The mathematical expectation (also known as the expectation, the expected value or the mean) of a random variable is the average value taken by the variable over an infinite number of trials.

The expectation is linear, meaning for instance that the expectation of a sum of random variables is the sum of their expectations.

In picture...

Expected value of a die roll

We will compute an estimate of the expected value of a random variable recording the number of dots on the top face of a die by rolling the die a large number of times and computing the average score.



[Interactive figure: successive die rolls, displaying the value observed at each trial, the empirical average, the number of rolls, and the estimation error; the empirical average converges to the expected value of 3.5.]
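
A minimal sketch of this experiment in Python with NumPy, where a large but finite number of rolls stands in for the infinite sequence:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

n = 100_000                           # number of rolls (finite stand-in for "infinite")
rolls = rng.integers(1, 7, size=n)    # each roll is uniform on {1, ..., 6}

empirical_average = rolls.mean()
print(f"Empirical average over {n} rolls: {empirical_average:.4f}")
print(f"Estimation error: {abs(empirical_average - 3.5):.4f}")
```

With more rolls, the empirical average gets closer to 3.5 and the estimation error shrinks.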

In maths...

Expectation of a discrete random variable

The expectation of a discrete random variable $Y$ taking values in $\Y$ is $$ \E_Y [ Y ] = \sum_{y\in\Y} y \ P( Y = y ) , $$ i.e., the sum of all possible values taken by $Y$, weighted by their probabilities.

More generally, for any function $f : \Y \rightarrow \R$, we have $$ \E_Y [ f(Y) ] = \sum_{y\in\Y} f(y) \ P( Y = y ) . $$
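
Both sums are straightforward to evaluate for a fair six-sided die; here is a small sketch using exact rational arithmetic (the choice $f(y) = y^2$ is just for illustration):

```python
from fractions import Fraction

values = range(1, 7)       # Y takes values in {1, ..., 6}
p = Fraction(1, 6)         # P(Y = y) = 1/6 for a fair die

E_Y = sum(y * p for y in values)       # sum of y * P(Y = y)
E_f = sum(y**2 * p for y in values)    # E[f(Y)] with f(y) = y**2

print(E_Y, E_f)   # 7/2 91/6
```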

Expectation of a continuous random variable

The expectation of a continuous random variable $X$ taking values in $\X$ is (if it exists) $$ \E_X [ X ] = \int_{\X} x\, p_X(x)\, dx , $$ where $p_X$ is the probability density function of $X$.

More generally, for any function $f : \X \rightarrow \R$, we have $$ \E_X [ f(X) ] = \int_{\X} f(x)\, p_X(x)\, dx . $$
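
As a sketch, these integrals can be evaluated numerically, here with SciPy's quad for an exponential density $p_X(x) = \lambda e^{-\lambda x}$ on $\X = [0,+\infty)$, for which $\E_X[X] = 1/\lambda$ (the density and the choice $f(x) = x^2$ are illustrative assumptions):

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0
p_X = lambda x: lam * np.exp(-lam * x)   # exponential density on [0, +inf)

E_X, _ = quad(lambda x: x * p_X(x), 0, np.inf)      # integral of x p_X(x) dx
E_f, _ = quad(lambda x: x**2 * p_X(x), 0, np.inf)   # E[f(X)] with f(x) = x**2

print(E_X)   # ~0.5, i.e. 1/lam
print(E_f)   # ~0.5, i.e. 2/lam**2
```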

Linearity of the expectation

Given two random variables $X$ and $Y$ defined on the same probability space $(\Omega,\Sigma,P)$, we have, for any $\alpha \in \R$ and $\beta\in\R$, $$ \E [ \alpha X ] = \alpha \E [ X ] $$ and $$ \E [ X + Y ] = \E [ X ] + \E [ Y ], $$ which can be combined to yield $$ \E [ \alpha X + \beta Y ] = \alpha \E [ X ] + \beta \E [ Y ] . $$
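
Note that linearity requires no assumption of independence between $X$ and $Y$. As a quick empirical sketch (the distributions and the values of $\alpha$ and $\beta$ are arbitrary choices), the empirical average of $\alpha X + \beta Y$ matches $\alpha \E[X] + \beta \E[Y]$ computed from the known expectations:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 1_000_000

X = rng.normal(loc=1.0, scale=2.0, size=n)   # X ~ N(1, 4), so E[X] = 1
Y = rng.integers(1, 7, size=n)               # Y: fair die roll, so E[Y] = 3.5
alpha, beta = 2.0, -3.0

empirical = (alpha * X + beta * Y).mean()    # estimate of E[alpha X + beta Y]
theoretical = alpha * 1.0 + beta * 3.5       # alpha E[X] + beta E[Y] = -8.5

print(empirical, theoretical)                # both close to -8.5
```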

Estimating the expectation

The expectation of a random variable can be estimated by computing the empirical average over a finite number of trials; by the strong law of large numbers, this average converges almost surely to the expectation: $$ P\left\{ \lim_{n\rightarrow +\infty} \frac{1}{n} \sum_{i=1}^n X_i = \E[ X ] \right\} = 1 , $$ where each $X_i$ is an independent copy of the random variable $X$.
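
A sketch of this convergence, reusing the die-roll example: the running empirical average approaches $\E[X] = 3.5$ as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

X = rng.integers(1, 7, size=10**6).astype(float)   # i.i.d. copies of a die roll
running_avg = np.cumsum(X) / np.arange(1, X.size + 1)

for n in (10, 100, 10_000, 1_000_000):
    err = abs(running_avg[n - 1] - 3.5)
    print(f"n = {n:>9}: average = {running_avg[n - 1]:.4f}, error = {err:.4f}")
```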