Discrete Distributions
An introduction to key discrete probability distributions: the uniform, Bernoulli, and binomial distributions, with examples, expectations, and variances.
So far, we have seen that a probability distribution describes the relationship between the values a random variable can take and the probabilities assigned to them.
In this article, we focus on three fundamental discrete distributions:
the discrete uniform distribution, the Bernoulli distribution, and the binomial distribution.
For each, we will look at its definition, an example, expectation, and variance.
Discrete Uniform Distribution
Definition
If a random variable $X$ takes each of the values $1, 2, \ldots, n$ with equal probability:

$$P(X = k) = \frac{1}{n}, \quad k = 1, 2, \ldots, n$$
Example: A fair die
Rolling a fair six-sided die corresponds to a uniform distribution with $n = 6$:

$$P(X = k) = \frac{1}{6}, \quad k = 1, 2, \ldots, 6$$
Expectation and Variance
- Expectation: $E[X] = \dfrac{n+1}{2}$
- Variance: $\mathrm{Var}(X) = \dfrac{n^2 - 1}{12}$
For a die ($n = 6$):

$$E[X] = \frac{7}{2} = 3.5, \qquad \mathrm{Var}(X) = \frac{35}{12} \approx 2.92$$
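To check these formulas, the expectation and variance can be computed directly from the definition, summing over the equally likely outcomes. This is a minimal sketch (the function name `uniform_mean_var` is our own):

```python
from fractions import Fraction

# Discrete uniform on {1, ..., n}: P(X = k) = 1/n for each k
def uniform_mean_var(n):
    values = range(1, n + 1)
    mean = Fraction(sum(values), n)                            # E[X] = (n+1)/2
    var = Fraction(sum(k * k for k in values), n) - mean ** 2  # Var(X) = E[X^2] - E[X]^2
    return mean, var

mean, var = uniform_mean_var(6)
print(mean, var)  # 7/2 and 35/12, matching (n+1)/2 and (n^2-1)/12 for n = 6
```

Using `Fraction` keeps the results exact, so they match the closed-form values without floating-point rounding.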
Bernoulli Distribution
Definition
If a random variable $X$ represents a “success” (1) with probability $p$ and a “failure” (0) with probability $1 - p$:

$$P(X = 1) = p, \qquad P(X = 0) = 1 - p$$
Example: A coin toss
Consider tossing a coin once, with heads as “success.”
If $p = \frac{1}{2}$:

$$P(X = 1) = P(X = 0) = \frac{1}{2}$$
Expectation and Variance
- Expectation: $E[X] = p$
- Variance: $\mathrm{Var}(X) = p(1 - p)$
For a fair coin ($p = \frac{1}{2}$):

$$E[X] = \frac{1}{2}, \qquad \mathrm{Var}(X) = \frac{1}{4}$$
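Because a Bernoulli variable takes only the values 0 and 1, its expectation and variance reduce to simple expressions in $p$. A small sketch (the helper name `bernoulli_mean_var` is our own):

```python
from fractions import Fraction

# Bernoulli(p): P(X = 1) = p, P(X = 0) = 1 - p
def bernoulli_mean_var(p):
    mean = 1 * p + 0 * (1 - p)  # E[X] = p
    var = p * (1 - p)           # Var(X) = E[X^2] - E[X]^2 = p - p^2
    return mean, var

print(bernoulli_mean_var(Fraction(1, 2)))  # (1/2, 1/4) for a fair coin
```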
Binomial Distribution
Definition
The binomial distribution describes the number of successes $X$ in $n$ independent Bernoulli trials, each with success probability $p$:

$$P(X = k) = \binom{n}{k} p^k (1 - p)^{n - k}, \quad k = 0, 1, \ldots, n$$
Example: 10 coin tosses
If we toss a fair coin ($p = \frac{1}{2}$) 10 times, the probability of getting $k$ heads is:

$$P(X = k) = \binom{10}{k} \left(\frac{1}{2}\right)^{10}$$
Expectation and Variance
- Expectation: $E[X] = np$
- Variance: $\mathrm{Var}(X) = np(1 - p)$
For $n = 10$, $p = \frac{1}{2}$:

$$E[X] = 5, \qquad \mathrm{Var}(X) = \frac{5}{2} = 2.5$$
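The binomial formula is easy to evaluate directly, and summing over all $k$ lets us confirm both that the probabilities total 1 and that the expectation and variance match $np$ and $np(1-p)$. A minimal sketch (the function name `binomial_pmf` is our own):

```python
from math import comb
from fractions import Fraction

# Binomial(n, p): P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)
def binomial_pmf(n, p, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, Fraction(1, 2)

# The pmf sums to 1 over k = 0, ..., n
total = sum(binomial_pmf(n, p, k) for k in range(n + 1))

# Compute E[X] and Var(X) from the pmf
mean = sum(k * binomial_pmf(n, p, k) for k in range(n + 1))
var = sum(k**2 * binomial_pmf(n, p, k) for k in range(n + 1)) - mean**2

print(total, mean, var)  # 1, 5, 5/2 — agreeing with np and np(1-p)
```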
Summary
- Discrete Uniform Distribution: All outcomes equally likely (e.g., a die).
- Bernoulli Distribution: Two outcomes, success (1) or failure (0) (e.g., a coin toss).
- Binomial Distribution: The number of successes in $n$ independent Bernoulli trials.
These three are the basic building blocks of discrete probability distributions and form the foundation for learning more advanced distributions.
Interactive Discrete Distributions Demo
The discrete uniform distribution assigns equal probability to each of n possible outcomes. Like rolling a fair die, each outcome is equally likely. As you increase n, the variance increases but each individual probability decreases.
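The trade-off described above can be seen numerically: as $n$ grows, the variance $(n^2 - 1)/12$ increases while each outcome's probability $1/n$ shrinks. A quick sketch (the helper name `uniform_var` is our own):

```python
from fractions import Fraction

# Variance of the discrete uniform on {1, ..., n}
def uniform_var(n):
    return Fraction(n * n - 1, 12)

for n in (2, 6, 20):
    # variance grows with n; each individual probability 1/n shrinks
    print(n, uniform_var(n), Fraction(1, n))
```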