Notes on Probability and Distribution
Here I collect some notes on common probability distributions for the convenience of my future reference.
Discrete Random Variables
1. Uniform Distribution
- pmf: $P(x) = \frac{1}{b-a+1}$ for x = a, a + 1,…,b
- E[$X$]: $\frac{a+b}{2}$
- var($X$): $\frac{(b-a+1)^2 - 1}{12}$
- meaning: it represents a value drawn from the set {a, a+1,…,b} when every value in the set is equally likely.
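As a quick sanity check, the mean and variance formulas above can be verified by direct enumeration; the endpoints below are arbitrary example values:

```python
# Verify the discrete uniform mean/variance formulas by enumeration.
# a and b are arbitrary example endpoints.
a, b = 3, 10
n = b - a + 1
values = list(range(a, b + 1))
mean = sum(values) / n                           # should equal (a+b)/2
var = sum((x - mean) ** 2 for x in values) / n   # should equal (n^2 - 1)/12

assert mean == (a + b) / 2
assert abs(var - (n ** 2 - 1) / 12) < 1e-9
```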
2. Binomial Distribution
- pmf: $P(x) = {n \choose x} p^{x} (1-p)^{n-x}$ for x = 0, 1,…,n
- E[$X$]: $np$
- var($X$): $np(1-p)$
- meaning: it represents the number of successes in a sequence of $n$ experiments when each trial is independently a success with probability $p$.
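The pmf, mean, and variance above can be checked numerically with `math.comb`; `n` and `p` below are arbitrary example parameters:

```python
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) = C(n, x) p^x (1-p)^(n-x)."""
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

n, p = 10, 0.3  # arbitrary example parameters
support = range(n + 1)
pmf = [binom_pmf(x, n, p) for x in support]
mean = sum(x * q for x, q in zip(support, pmf))
var = sum((x - mean) ** 2 * q for x, q in zip(support, pmf))

assert abs(sum(pmf) - 1) < 1e-9            # pmf sums to 1
assert abs(mean - n * p) < 1e-9            # E[X] = np
assert abs(var - n * p * (1 - p)) < 1e-9   # var(X) = np(1-p)
```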
3. Poisson Distribution
- pmf: $P(x) = \frac{e^{-\lambda t}(\lambda t)^{x}}{x!}$ for x = 0, 1, 2,…
- E[$X$]: $\lambda t$
- var($X$): $\lambda t$
- meaning: it represents the number of events occurring in a fixed period of time with the expected number of occurrences $\lambda t$ when events occur with a known average rate $\lambda$.
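A quick numerical check that the mean and variance both equal $\lambda t$; the value of `lam_t` below is an arbitrary example, and the infinite support is truncated where the tail is negligible:

```python
from math import exp, factorial

def poisson_pmf(x, lam_t):
    """P(X = x) = e^(-λt) (λt)^x / x!."""
    return exp(-lam_t) * lam_t ** x / factorial(x)

lam_t = 4.0  # arbitrary example expected count λt
# Truncating the infinite support at x = 99 leaves a negligible tail.
mean = sum(x * poisson_pmf(x, lam_t) for x in range(100))
var = sum((x - mean) ** 2 * poisson_pmf(x, lam_t) for x in range(100))

assert abs(mean - lam_t) < 1e-9
assert abs(var - lam_t) < 1e-9
```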
4. Geometric Distribution
- pmf: $P(x) = (1-p)^{x-1} p$ for x = 1, 2,…
- E[$X$]: $\frac{1}{p}$
- var($X$): $\frac{1-p}{p^2}$
- meaning: it represents the number of trials needed to get the first success when each trial is independently a success with probability $p$.
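The geometric mean and variance formulas can be checked by summing the series directly; `p` below is an arbitrary example, and truncating at x = 500 leaves a negligible tail:

```python
p = 0.25  # arbitrary example success probability
# E[X] = Σ x (1-p)^(x-1) p over x = 1, 2, …, truncated at x = 500.
mean = sum(x * (1 - p) ** (x - 1) * p for x in range(1, 500))
var = sum((x - mean) ** 2 * (1 - p) ** (x - 1) * p for x in range(1, 500))

assert abs(mean - 1 / p) < 1e-9              # E[X] = 1/p
assert abs(var - (1 - p) / p ** 2) < 1e-9    # var(X) = (1-p)/p^2
```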
5. Negative Binomial Distribution
- pmf: $P(x) = {x-1 \choose r-1} p^{r} (1-p)^{x-r}$ for x = r, r+1,…
- E[$X$]: $\frac{r}{p}$
- var($X$): $\frac{r(1-p)}{p^2}$
- meaning: it represents the number of trials needed to get the $r$-th success when each trial is independently a success with probability $p$.
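The same kind of truncated-series check works here; `r` and `p` below are arbitrary example parameters:

```python
from math import comb

def nbinom_pmf(x, r, p):
    """P(X = x) = C(x-1, r-1) p^r (1-p)^(x-r)."""
    return comb(x - 1, r - 1) * p ** r * (1 - p) ** (x - r)

r, p = 3, 0.5  # arbitrary example parameters
# Truncate the infinite support; the tail beyond x = 300 is negligible.
mean = sum(x * nbinom_pmf(x, r, p) for x in range(r, 300))
var = sum((x - mean) ** 2 * nbinom_pmf(x, r, p) for x in range(r, 300))

assert abs(mean - r / p) < 1e-9                  # E[X] = r/p
assert abs(var - r * (1 - p) / p ** 2) < 1e-9    # var(X) = r(1-p)/p^2
```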
Continuous Random Variables
1. Uniform Distribution
- pdf: $P(x) = \frac{1}{b-a}$ for $a \leq x \leq b$
- E[$X$]: $\frac{a+b}{2}$
- var($X$): $\frac{(b-a)^2}{12}$
- meaning: it represents a random variable uniformly distributed over the interval [a, b]
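The continuous mean and variance can be approximated by a midpoint Riemann sum under the density $\frac{1}{b-a}$; `a` and `b` below are arbitrary example endpoints:

```python
a, b = 2.0, 5.0  # arbitrary example interval endpoints
N = 100_000
# Midpoint Riemann sum over [a, b] with density 1/(b-a).
mids = [a + (i + 0.5) * (b - a) / N for i in range(N)]
mean = sum(mids) / N
var = sum((x - mean) ** 2 for x in mids) / N

assert abs(mean - (a + b) / 2) < 1e-6          # E[X] = (a+b)/2
assert abs(var - (b - a) ** 2 / 12) < 1e-6     # var(X) = (b-a)^2/12
```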
2. Normal Distribution
- pdf: $P(x) = \frac{1}{\sqrt{2\pi} \sigma} \cdot e^{\frac{-(x-\mu)^2}{2\sigma^2}}$ for $-\infty < x < \infty$
- E[$X$]: $\mu$
- var($X$): $\sigma^2$
- meaning: it represents a quantity that clusters symmetrically around the mean $\mu$ with spread $\sigma$; by the central limit theorem, sums of many independent random variables are approximately normal.
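A numerical check that the pdf integrates to 1 and has mean $\mu$, again via a midpoint Riemann sum; `mu` and `sigma` below are arbitrary example parameters, and the tail outside $\mu \pm 10\sigma$ is negligible:

```python
from math import exp, pi, sqrt

mu, sigma = 1.0, 2.0  # arbitrary example parameters

def normal_pdf(x):
    """P(x) = exp(-(x-μ)^2 / (2σ^2)) / (√(2π) σ)."""
    return exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sqrt(2 * pi) * sigma)

# Midpoint Riemann sum over μ ± 10σ.
N = 100_000
lo, hi = mu - 10 * sigma, mu + 10 * sigma
dx = (hi - lo) / N
mids = [lo + (i + 0.5) * dx for i in range(N)]
total = sum(normal_pdf(x) for x in mids) * dx
mean = sum(x * normal_pdf(x) for x in mids) * dx

assert abs(total - 1) < 1e-6    # density integrates to 1
assert abs(mean - mu) < 1e-6    # E[X] = μ
```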
3. Exponential Distribution
- pdf: $P(x) = \lambda e^{-\lambda x}$ for $x \geq 0$
- E[$X$]: $\frac{1}{\lambda}$
- var($X$): $\frac{1}{\lambda^2}$
- meaning: it represents the waiting time until an event occurs when events arrive at a constant rate $\lambda$.
- property: memorylessness: $P(\tau > s+t \mid \tau > s) = P(\tau > t)$, which means that if we have already waited for $s$ time units, the remaining waiting time has the same distribution as the waiting time starting from time 0.
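The memorylessness property follows directly from the survival function $P(\tau > x) = e^{-\lambda x}$, since $e^{-\lambda(s+t)}/e^{-\lambda s} = e^{-\lambda t}$; a quick check with arbitrary example values of $\lambda$, $s$, $t$:

```python
from math import exp

lam, s, t = 0.5, 2.0, 3.0  # arbitrary example rate and waiting times

def survival(x):
    """P(τ > x) = e^(-λx) for an Exponential(λ) waiting time τ."""
    return exp(-lam * x)

# Memorylessness: P(τ > s+t | τ > s) = P(τ > s+t) / P(τ > s) = P(τ > t).
conditional = survival(s + t) / survival(s)

assert abs(conditional - survival(t)) < 1e-12
```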
4. Gamma Distribution
- pdf: $P(x) = \frac{\lambda e^{-\lambda x}(\lambda x)^{\alpha -1}}{\Gamma(\alpha)}$ for $x \geq 0$, where $\Gamma(\alpha) = \int_{0}^{\infty} e^{-y} y^{\alpha -1} \, dy$
- E[$X$]: $\frac{\alpha}{\lambda}$
- var($X$): $\frac{\alpha}{\lambda^2}$
- meaning: it represents the amount of time one has to wait until a total of $\alpha$ events occur when events arrive at rate $\lambda$ (for integer $\alpha$).
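This waiting-time interpretation can be checked by Monte Carlo: for integer $\alpha$, a Gamma($\alpha$, $\lambda$) variable is the sum of $\alpha$ independent Exponential($\lambda$) inter-arrival times. The parameters below are arbitrary examples, and exponential samples are drawn by inverse-transform sampling:

```python
import random
from math import log

random.seed(0)
alpha, lam = 3, 2.0  # arbitrary example shape (integer) and rate
n_samples = 50_000

# Sum of α independent Exp(λ) waits; -log(U)/λ is an Exp(λ) sample
# via inverse-transform sampling.
samples = [sum(-log(random.random()) / lam for _ in range(alpha))
           for _ in range(n_samples)]
mean = sum(samples) / n_samples

assert abs(mean - alpha / lam) < 0.05  # E[X] = α/λ
```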