Expectation

The expected value, or mean, or first moment, of X is defined to be
E(X) = ∫ x dF(x),
which equals ∑ x f_X(x) when X is discrete and ∫ x f_X(x) dx when X is continuous,
assuming that the sum (or integral) is well-defined. We use the following notation to denote the expected value of X:
E(X) = EX = ∫ x dF(x) = μ = μ_X

For a uniform probability distribution, the expectation is simply the arithmetic mean of the outcomes, since every outcome has equal probability. Take the example of rolling an unbiased die. It has 6 outcomes {1, 2, 3, 4, 5, 6}, each with probability 1/6, so the expectation is (1 + 2 + 3 + 4 + 5 + 6)/6 = 3.5. To understand expectation intuitively, consider the die rolled n times. As n increases, the average of the rolls almost surely converges to the expected value, a fact known as the strong law of large numbers.
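
This convergence is easy to see numerically. Below is a minimal Python sketch (the sample sizes and seed are arbitrary choices) that tracks the running average of simulated die rolls as it settles near 3.5:

import random

# Simulate rolls of a fair six-sided die and watch the running
# average approach the expectation, 3.5, as n grows.
random.seed(0)  # arbitrary seed, for reproducibility

total = 0
for i in range(1, 100_001):
    total += random.randint(1, 6)  # one fair die roll
    if i in (10, 100, 1_000, 10_000, 100_000):
        print(f"after {i:>6} rolls, average = {total / i:.4f}")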

More generally, think of the expectation as the average value obtained if we compute the numerical average of a large number of IID draws X1, X2, . . . , Xn. Note that the expectation only exists when ∫ |x| dF_X(x) < ∞.
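
That integrability condition is not a technicality. The standard Cauchy distribution is the classic case where ∫ |x| dF_X(x) = ∞, so its expectation does not exist and sample means never settle down. A small illustration, assuming NumPy is available:

import numpy as np

# For a standard Cauchy distribution, E(X) does not exist:
# the sample mean keeps jumping around no matter how large n is.
rng = np.random.default_rng(0)  # arbitrary seed

for n in (10**3, 10**5, 10**7):
    print(f"n = {n:>8}: Cauchy sample mean = {rng.standard_cauchy(n).mean():+.3f}")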

Example: Flip a fair coin twice and let X be the number of heads. Then,
E(X) = ∫ x dF_X(x) = ∑ x f_X(x) = (0 × f(0)) + (1 × f(1)) + (2 × f(2))
        = (0 × 1/4) + (1 × 1/2) + (2 × 1/4) = 1
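
The same weighted sum can be written out mechanically in code. A short Python sketch that enumerates the four equally likely outcomes, builds the pmf of X, and takes the probability-weighted sum:

from itertools import product

# Enumerate all outcomes of two fair coin flips, build the pmf of
# X = number of heads, then compute the probability-weighted sum.
outcomes = list(product("HT", repeat=2))      # HH, HT, TH, TT
pmf = {}
for flips in outcomes:
    x = flips.count("H")
    pmf[x] = pmf.get(x, 0) + 1 / len(outcomes)

print(pmf)                                    # {2: 0.25, 1: 0.5, 0: 0.25}
print(sum(x * p for x, p in pmf.items()))     # 1.0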


The Rule of the Lazy Statistician

Let Y = r(X). Then
E(Y) = E(r(X)) = ∫ r(x) dF_X(x)

Example: Take a stick of unit length and break it at random. Let Y be the length of the longer piece. What is the mean of Y?
If X is the breaking point then X ~ Unif(0, 1) and Y = r(X) = max{X, 1-X}.
Thus r(x) = 1 − x when 0 < x < 1/2 and r(x) = x when 1/2 < x < 1.
Integrating r(x) over each range and summing,
E(Y) = ∫₀^(1/2) (1 − x) dx + ∫_(1/2)^1 x dx = 3/8 + 3/8 = 3/4.
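
If you would rather check the answer by simulation than by integration, a quick Monte Carlo sketch (sample size and seed are arbitrary) lands on the same 3/4:

import random

# Monte Carlo estimate of E(Y) for Y = max{X, 1 - X}, X ~ Unif(0, 1).
random.seed(0)  # arbitrary seed

n = 1_000_000
total = 0.0
for _ in range(n):
    x = random.random()       # uniform breaking point
    total += max(x, 1 - x)    # length of the longer piece

print(total / n)  # ≈ 0.75, matching the exact answer 3/4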


Properties of Expectation

  • If X1, X2, . . . , Xn are random variables and a1, a2, . . . , an are constants, then

E(∑ ai Xi) = ∑ ai E(Xi)

  • If X1, X2, . . . , Xn are independent random variables, then

E(∏ Xi) = ∏ E(Xi)

Note that the summation rule doesn’t require independence, but the multiplication rule does.
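
Both properties are straightforward to sanity-check by simulation. A sketch assuming NumPy, with the distributions and constants chosen arbitrarily for illustration:

import numpy as np

# X and Y are independent draws, so E(XY) should equal E(X)E(Y);
# linearity of expectation holds with or without independence.
rng = np.random.default_rng(0)  # arbitrary seed
n = 1_000_000
x = rng.exponential(scale=2.0, size=n)   # E(X) = 2
y = rng.uniform(0.0, 1.0, size=n)        # E(Y) = 0.5

a1, a2 = 3.0, -1.0
print((a1 * x + a2 * y).mean(), a1 * x.mean() + a2 * y.mean())  # both ≈ 5.5
print((x * y).mean(), x.mean() * y.mean())                      # both ≈ 1.0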
