Expectation and Variance

Learn how to compute the expectation and variance of random variables in both discrete and continuous cases, with clear definitions and examples.

Probability · Beginner · Expectation · Variance

Two fundamental concepts in probability theory are expectation (mean) and variance.

  • The expectation represents the “average value” of a random variable.
  • The variance measures how much the values spread out around the expectation.

In this article, we explain how to compute them in both discrete and continuous settings, with simple examples.

Definition of Expectation

Discrete Case

If a random variable X takes values x_i with probabilities p_i:

E[X] = \sum_i x_i \, p_i

This is a weighted average.
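This weighted average translates directly into code. A minimal sketch (the values and probabilities below are a made-up example distribution):

```python
# Discrete expectation as a weighted average: E[X] = sum_i x_i * p_i.
# The distribution below is a made-up example.
values = [1, 2, 3]
probs = [0.25, 0.5, 0.25]  # probabilities must sum to 1

expectation = sum(x * p for x, p in zip(values, probs))
print(expectation)  # 0.25*1 + 0.5*2 + 0.25*3 = 2.0
```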

Continuous Case

If X has probability density function f(x):

E[X] = \int_{-\infty}^{\infty} x \, f(x) \, dx

This is the average with respect to the density.
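When the integral has no convenient closed form, it can be approximated numerically. A sketch using a midpoint Riemann sum, assuming the example density f(x) = 2x on [0, 1], whose exact expectation is 2/3:

```python
# Approximate E[X] = ∫ x f(x) dx with a midpoint Riemann sum.
# f(x) = 2x on [0, 1] is an assumed example density; exact E[X] = 2/3.
def f(x):
    return 2 * x

a, b, n = 0.0, 1.0, 100_000
dx = (b - a) / n
midpoints = (a + (i + 0.5) * dx for i in range(n))
expectation = sum(x * f(x) * dx for x in midpoints)
print(expectation)  # close to 0.6666...
```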

Definition of Variance

Variance is defined as the expectation of the squared deviation from the mean:

V[X] = E\big[(X - E[X])^2\big]

It can be rewritten in a more convenient form:

V[X] = E[X^2] - (E[X])^2
  • E[X^2]: the expectation of the square
  • (E[X])^2: the square of the expectation
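The two forms can be checked against each other numerically. A sketch with a made-up discrete distribution:

```python
# Verify that V[X] = E[(X - E[X])^2] agrees with the shortcut
# V[X] = E[X^2] - (E[X])^2 for a made-up discrete distribution.
values = [0, 1, 4]
probs = [0.5, 0.25, 0.25]

mean = sum(x * p for x, p in zip(values, probs))
var_definition = sum((x - mean) ** 2 * p for x, p in zip(values, probs))
var_shortcut = sum(x * x * p for x, p in zip(values, probs)) - mean ** 2

print(var_definition, var_shortcut)  # both equal 2.6875
```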

Example 1: A Die (Discrete Uniform Distribution)

Let X be the outcome of a fair six-sided die (1, 2, 3, 4, 5, 6).

  • Expectation:
E[X] = \frac{1+2+3+4+5+6}{6} = 3.5
  • Variance:
E[X^2] = \frac{1^2+2^2+3^2+4^2+5^2+6^2}{6} = \frac{91}{6}
V[X] = E[X^2] - (E[X])^2 = \frac{91}{6} - (3.5)^2 = \frac{35}{12} \approx 2.92
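The die calculation can be reproduced exactly with Python's standard-library `fractions` module, avoiding any rounding:

```python
from fractions import Fraction

# Fair six-sided die: each face has probability 1/6.
faces = range(1, 7)
p = Fraction(1, 6)

E = sum(Fraction(x) * p for x in faces)        # 7/2 = 3.5
E2 = sum(Fraction(x) ** 2 * p for x in faces)  # 91/6
V = E2 - E ** 2                                # 35/12 ≈ 2.92
print(E, E2, V)  # 7/2 91/6 35/12
```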

Example 2: Uniform Distribution on [0,1] (Continuous)

Let X be uniformly distributed on [0,1].
Its density is f(x) = 1 for 0 \leq x \leq 1.

  • Expectation:
E[X] = \int_0^1 x \, dx = \frac{1}{2}
  • Variance:
E[X^2] = \int_0^1 x^2 \, dx = \frac{1}{3}
V[X] = E[X^2] - (E[X])^2 = \frac{1}{3} - \left(\frac{1}{2}\right)^2 = \frac{1}{12} \approx 0.083
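These values can also be checked by simulation: drawing many Uniform[0,1] samples and computing the sample mean and variance should land near 1/2 and 1/12. A sketch using only the standard library (the sample size and seed are arbitrary choices):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible
n = 200_000
samples = [random.random() for _ in range(n)]  # Uniform[0, 1] draws

mean = sum(samples) / n
variance = sum((x - mean) ** 2 for x in samples) / n

print(mean, variance)  # close to 0.5 and 0.0833...
```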

Summary

  • Expectation is the average value of a random variable.
    • Discrete: E[X] = \sum x_i \, p_i
    • Continuous: E[X] = \int x \, f(x) \, dx
  • Variance measures the spread around the expectation.
    • V[X] = E[X^2] - (E[X])^2
  • Example: Die → E[X] = 3.5, \; V[X] \approx 2.92
  • Example: Uniform[0,1] → E[X] = 0.5, \; V[X] \approx 0.083

Understanding expectation and variance deepens our grasp of how probability distributions are characterized. These quantities will keep appearing as we study specific distributions in more detail.

Worked Check: Fair Six-Sided Die

Calculations

Expectation:
E[X] = Σ x_i × p_i = (1+2+3+4+5+6)/6 = 3.50
E[X²]:
E[X²] = (1²+2²+3²+4²+5²+6²)/6 = 15.17
Variance:
V[X] = E[X²] - (E[X])² = 2.917
Standard Deviation:
σ = √V[X] = 1.708

Interpretation

  • Each outcome has probability 1/6
  • The expected value (3.5) is the "center of gravity" of the distribution
  • Variance measures spread around the mean
  • Higher variance = more spread out values

Key Insights

  • Discrete: Expectation is weighted average of possible values
  • Continuous: Expectation is integral of x × density
  • Variance: Always E[X²] - (E[X])² in both cases
  • Units: Variance has units²; standard deviation has same units as X