This page pairs well with Probability.
Table of Distributions
| Distribution | Mass/density function | Support \[S_X\] | Mean \[\mathbb{E}(X)\] | Variance \[\mathrm{Var}(X)\] | MGF \[\phi_X(s)\] |
|---|---|---|---|---|---|
| Bernoulli \[\mathrm{Bern}(\pi)\] | \[P(X=1)=\pi\\ P(X=0)=1-\pi\] | \[\{0,1\}\] | \[\pi\] | \[\pi(1-\pi)\] | \[(1-\pi)+\pi e^{s}\] |
| Binomial \[\mathrm{Bin}(n,\pi)\] | \[p_X(x)=\binom{n}{x}\pi^{x}(1-\pi)^{n-x}\] | \[\{0,1,\dots,n\}\] | \[n\pi\] | \[n\pi(1-\pi)\] | \[(1-\pi+\pi e^{s})^{n}\] |
| Geometric \[\mathrm{Geo}(\pi)\] | \[p_X(x)=\pi(1-\pi)^{x-1}\] | \[\{1,2,\dots\}\] | \[\pi^{-1}\] | \[(1-\pi)\pi^{-2}\] | \[\frac{\pi}{e^{-s}-1+\pi}\] |
| Poisson \[\mathcal{P}(\lambda)\] | \[p_X(x)=e^{-\lambda}\lambda^{x}/x!\] | \[\{0,1,\dots\}\] | \[\lambda\] | \[\lambda\] | \[\exp\{\lambda(e^{s}-1)\}\] |
| Uniform \[U[\alpha,\beta]\] | \[f_X(x)=(\beta-\alpha)^{-1}\] | \[[\alpha,\beta]\] | \[\frac{1}{2}(\alpha+\beta)\] | \[\frac{1}{12}(\beta-\alpha)^2\] | \[\frac{e^{\beta s}-e^{\alpha s}}{s(\beta-\alpha)}\] |
| Exponential \[\mathrm{Exp}(\lambda)\] | \[f_X(x)=\lambda e^{-\lambda x}\] | \[[0,\infty)\] | \[\lambda^{-1}\] | \[\lambda^{-2}\] | \[\frac{\lambda}{\lambda-s}\] |
| Gaussian \[\mathcal{N}(\mu,\sigma^{2})\] | \[f_X(x)=\frac{1}{\sqrt{2\pi\sigma^{2}}}\exp\left\{-\frac{(x-\mu)^2}{2\sigma^{2}}\right\}\] | \[\mathbb{R}\] | \[\mu\] | \[\sigma^{2}\] | \[e^{\mu s+\frac{1}{2}\sigma^{2}s^{2}}\] |
| Gamma \[\Gamma(\alpha,\lambda)\] | \[f_X(x)=\frac{1}{\Gamma(\alpha)}\lambda^{\alpha}x^{\alpha-1}e^{-\lambda x}\] | \[[0,\infty)\] | \[\alpha\lambda^{-1}\] | \[\alpha\lambda^{-2}\] | \[\left(\frac{\lambda}{\lambda-s}\right)^{\alpha}\] |
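A minimal sanity check of the mean and variance columns above, assuming SciPy is available; the parameter values chosen below are arbitrary, and note that SciPy parameterises the Exponential and Gamma by a scale \(1/\lambda\) rather than the rate \(\lambda\) used in the table.

```python
import numpy as np
from scipy import stats

# Arbitrary parameter values for illustration only.
pi, n, lam, a, b, mu, sigma, alpha = 0.3, 10, 2.0, -1.0, 3.0, 1.5, 0.8, 2.5

cases = [
    # (name, SciPy distribution, mean from table, variance from table)
    ("Bernoulli",   stats.bernoulli(pi),                 pi,            pi * (1 - pi)),
    ("Binomial",    stats.binom(n, pi),                  n * pi,        n * pi * (1 - pi)),
    ("Geometric",   stats.geom(pi),                      1 / pi,        (1 - pi) / pi**2),
    ("Poisson",     stats.poisson(lam),                  lam,           lam),
    ("Uniform",     stats.uniform(loc=a, scale=b - a),   (a + b) / 2,   (b - a)**2 / 12),
    ("Exponential", stats.expon(scale=1 / lam),          1 / lam,       1 / lam**2),
    ("Gaussian",    stats.norm(mu, sigma),               mu,            sigma**2),
    ("Gamma",       stats.gamma(alpha, scale=1 / lam),   alpha / lam,   alpha / lam**2),
]

for name, dist, mean_tbl, var_tbl in cases:
    # SciPy's closed-form moments should agree with the table entries.
    assert np.isclose(dist.mean(), mean_tbl), name
    assert np.isclose(dist.var(), var_tbl), name
    print(f"{name:12s} mean={dist.mean():.4f} var={dist.var():.4f}  OK")
```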
Statistical Inference
Definition (Random Sample & Model)
Let \(X=(X_1,\ldots,X_n)\) be i.i.d. from a parametric family \(\{F_\theta:\theta\in\Theta\subset\mathbb{R}^p\}\). The parameter \(\theta\) is unknown; inference uses the randomness of \(X\) to learn about \(\theta\).
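A minimal sketch of this setup, assuming the parametric family is \(\mathrm{Exp}(\lambda)\) from the table; the true \(\lambda\) below is an arbitrary choice, and the estimator \(1/\bar{X}\) (the maximum-likelihood estimator for the exponential model) stands in for "learning about \(\theta\)".

```python
import numpy as np

rng = np.random.default_rng(0)
lam_true = 2.0    # the unknown parameter theta; fixed here only to simulate data
n = 1_000         # sample size

# i.i.d. draws X_1, ..., X_n from Exp(lam_true); NumPy takes the scale 1/lambda.
X = rng.exponential(scale=1 / lam_true, size=n)

# MLE of lambda for the exponential model: 1 / sample mean.
lam_hat = 1 / X.mean()
print(f"true lambda = {lam_true}, estimate = {lam_hat:.3f}")
```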