# Expectation

The expected value, or mean, or first moment, of X is defined to be

E(X) = ∫ x dF(x) = ∑_{x} x f(x) if X is discrete, or ∫ x f(x) dx if X is continuous,

assuming that the sum (or integral) is well-defined. We use the following notation to denote the expected value of X:

E(X) = EX = ∫ x dF(x) = μ = μ_{X}

To understand expectation, think of it as the average value obtained if we compute the numerical average of a large number of IID draws X_{1}, X_{2}, . . . , X_{n}. Note that the expectation exists only when ∫ |x| dF_{X}(x) < ∞.
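The "average of many IID draws" intuition can be checked directly by simulation. The sketch below (my own illustrative example, not from the text) uses a fair six-sided die, whose true mean is 3.5:

```python
import random

# Sketch: approximate E(X) for a fair die by the numerical average
# of a large number of IID draws (the law of large numbers at work).
random.seed(0)
n = 100_000
draws = [random.randint(1, 6) for _ in range(n)]
estimate = sum(draws) / n  # should be close to the true mean 3.5
print(estimate)
```

As n grows, the estimate concentrates around 3.5, which is exactly the sense in which E(X) is a "long-run average".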

**Example:** Flip a fair coin twice and let X be the number of heads. Then,

E(X) = ∫ x dF_{X}(x) = ∑ x f_{X}(x) = (0 × f(0)) + (1 × f(1)) + (2 × f(2))

= (0 × (1/4)) + (1 × (1/2)) + (2 × (1/4)) = 1
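The same computation in code, summing x · f_X(x) over the pmf of the number of heads in two fair flips:

```python
# Sketch: E(X) for the number of heads in two fair coin flips,
# computed directly from the pmf f_X(0)=1/4, f_X(1)=1/2, f_X(2)=1/4.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}
mean = sum(x * p for x, p in pmf.items())
print(mean)  # 1.0
```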

### The Rule of the Lazy Statistician

Let Y = r(X). Then

E(Y) = E( r(X) ) = ∫ r(x) dF_{X}(x)

**Example:** Take a stick of unit length and break it at random. Let Y be the length of the longer piece. What is the mean of Y?

If X is the breaking point then X ~ Unif(0, 1) and Y = r(X) = max{X, 1-X}.

Thus r(x) = 1 – x when 0 < x < 1/2 and r(x) = x when 1/2 < x < 1.

Since r(x) is piecewise, split the integral over the two ranges:

E(Y) = ∫_{0}^{1/2} (1 − x) dx + ∫_{1/2}^{1} x dx = 3/8 + 3/8 = 3/4
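The rule of the lazy statistician also gives us a simulation recipe: average r(X) over draws of X, without ever working out the distribution of Y itself. A sketch for the stick-breaking example:

```python
import random

# Sketch: estimate E(Y) where Y = r(X) = max(X, 1 - X) is the longer
# piece of a unit stick broken at X ~ Unif(0, 1). By the rule of the
# lazy statistician we just average r(X) over many draws of X.
random.seed(0)
n = 200_000
estimate = sum(max(x, 1 - x) for x in (random.random() for _ in range(n))) / n
print(estimate)  # should be close to the exact value 3/4
```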

### Properties of Expectation

- If X_{1}, X_{2}, . . . , X_{n} are random variables and a_{1}, a_{2}, . . . , a_{n} are constants, then

E (∑ a_{i}X_{i}) = ∑ a_{i}E(X_{i})
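A quick numerical illustration of linearity, with arbitrary constants a_{1} = 2, a_{2} = −3 (chosen here purely for illustration) and two fair dice:

```python
import random

# Sketch: check E(a1*X1 + a2*X2) = a1*E(X1) + a2*E(X2) by simulation,
# with a1 = 2, a2 = -3 and X1, X2 fair dice (each has mean 3.5).
random.seed(0)
n = 100_000
x1 = [random.randint(1, 6) for _ in range(n)]
x2 = [random.randint(1, 6) for _ in range(n)]
lhs = sum(2 * a - 3 * b for a, b in zip(x1, x2)) / n  # mean of 2*X1 - 3*X2
rhs = 2 * sum(x1) / n - 3 * sum(x2) / n               # 2*E(X1) - 3*E(X2)
print(lhs, rhs)  # both near the true value 2*3.5 - 3*3.5 = -3.5
```

Note that linearity requires no independence: it holds sample by sample, which is why lhs and rhs agree here even before taking the large-n limit.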

- Let X_{1}, X_{2}, . . . , X_{n} be independent random variables. Then,

E (∏ X_{i}) = ∏ E(X_{i})

Note that the summation rule doesn't require independence but the multiplication rule does.
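The multiplication rule can also be checked by simulation. For two independent fair dice the product of means is 3.5 × 3.5 = 12.25:

```python
import random

# Sketch: for independent X1, X2 (two fair dice, drawn separately so
# they are independent), E(X1 * X2) should equal E(X1)*E(X2) = 12.25.
random.seed(0)
n = 200_000
prod_mean = sum(random.randint(1, 6) * random.randint(1, 6)
                for _ in range(n)) / n
print(prod_mean)  # should be close to 12.25
```

If the two draws were dependent (e.g. X2 = X1), the product mean would instead be E(X1²) ≈ 15.17, which is why independence matters here.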