Expectation

The expected value, or mean, or first moment, of X is defined to be

E(X) = ∫ x dF(x) = ∑ x fX(x) if X is discrete, or ∫ x fX(x) dx if X is continuous,

assuming that the sum (or integral) is well-defined. We use the following notation to denote the expected value of X:

E(X) = EX = ∫ x dF(x) = μ = μX

To understand expectation, think of it as the average value obtained if we compute the numerical average of a large number of IID draws X1, X2, . . . , Xn. Note that the expectation only exists when ∫ |x| dFX(x) < ∞.

Example: Flip a fair coin twice and let X be the number of heads. Then,
E(X) = ∫ x dFX(x) = ∑ x fX(x) = (0 × f(0)) + (1 × f(1)) + (2 × f(2))
        = (0 × (1/4)) + (1 × (1/2)) + (2 × (1/4)) = 1
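To see the "average of many IID draws" interpretation in action, here is a small Python simulation sketch (an illustration added here, not part of the original notes; the helper name num_heads is hypothetical). It draws X many times and checks that the numerical average is close to 1:

import random

def num_heads():
    # X = number of heads in two fair coin flips
    return sum(random.random() < 0.5 for _ in range(2))

n = 100_000
draws = [num_heads() for _ in range(n)]
print(sum(draws) / n)  # numerical average ≈ 1 = E(X)

With n = 100,000 draws the printed average is typically within about 0.01 of 1.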


The Rule of the Lazy Statistician

Let Y = r(X). Then
E(Y) = E(r(X)) = ∫ r(x) dFX(x)

Example: Take a stick of unit length and break it at random. Let Y be the length of the longer piece. What is the mean of Y?
If X is the breaking point then X ~ Unif(0, 1) and Y = r(X) = max{X, 1-X}.
Thus r(x) = 1 – x when 0 < x < 1/2 and r(x) = x when 1/2 < x < 1.
Since we have r(x) in pieces over different ranges, we integrate over each range and sum the results:

E(Y) = ∫ r(x) dFX(x) = ∫₀^(1/2) (1 − x) dx + ∫_(1/2)^1 x dx = 3/8 + 3/8 = 3/4
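The same answer comes out of a quick Monte Carlo sketch (Python, added here as an illustration rather than taken from the original notes), which applies r directly to uniform draws exactly as the rule of the lazy statistician suggests:

import random

n = 100_000
total = 0.0
for _ in range(n):
    x = random.random()     # X ~ Unif(0, 1): the breaking point
    total += max(x, 1 - x)  # Y = r(X): length of the longer piece
print(total / n)            # ≈ 0.75, matching E(Y) = 3/4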


Properties of Expectation

  • If X1, X2, . . . , Xn are random variables and a1, a2, . . . , an are constants, then

E(∑aiXi) = ∑aiE(Xi)

  • Let X1, X2, . . . , Xn be independent random variables. Then,

E(∏Xi) = ∏E(Xi)

Note that the summation rule doesn't require independence but the multiplication rule does.
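Both rules are easy to check numerically. The following Python sketch (an illustration added here, not part of the original notes) uses independent uniform draws, so the multiplication rule should hold alongside the summation rule:

import random

def mean(values):
    return sum(values) / len(values)

n = 200_000
xs = [random.random() for _ in range(n)]  # X ~ Unif(0, 1)
ys = [random.random() for _ in range(n)]  # Y ~ Unif(0, 1), independent of X

# Summation rule (no independence needed): E(2X + 3Y) = 2E(X) + 3E(Y) = 2.5
print(mean([2 * x + 3 * y for x, y in zip(xs, ys)]))

# Multiplication rule (independence needed): E(XY) = E(X)E(Y) = 0.25
print(mean([x * y for x, y in zip(xs, ys)]))

If the draws were dependent (say ys = xs), the second print would instead give E(X²) = 1/3 rather than E(X)E(X) = 1/4, which is exactly why the multiplication rule needs independence.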
