If Z is a normal random variable with parameters (μ = m, σ² = s²), then X = aZ + b is a normal random variable with parameters (μ = am + b, σ² = a²s²).
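This affine relationship can be checked numerically. The sketch below (parameter values are arbitrary, chosen only for illustration) compares the CDF of X = aZ + b computed from the claimed parameters against the same probability computed through Z:

```python
import math

def norm_cdf(x, mu, sigma):
    # Normal CDF expressed through the error function.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

m, s = 2.0, 3.0    # parameters of Z: mean m, standard deviation s
a, b = -1.5, 4.0   # affine transform X = a*Z + b (a < 0 in this example)

x = 1.0
# CDF of X from the claimed parameters; note the scale is |a|*s.
direct = norm_cdf(x, a * m + b, abs(a) * s)
# Same probability computed through Z: for a < 0,
# P(aZ + b <= x) = P(Z >= (x - b)/a) = 1 - F_Z((x - b)/a).
via_z = 1.0 - norm_cdf((x - b) / a, m, s)
```

The two values agree to floating-point precision.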
If the sum of independent random variables has a distribution from the same family of distributions as the original variables, that family of distributions is said to be closed under convolution. Often these distributions are also stable distributions (see also Discrete-stable distribution). Examples:
If X1 and X2 are independent Poisson random variables with means μ1 and μ2 respectively, then X1 + X2 is a Poisson random variable with mean μ1 + μ2.
The sum of independent gamma (αi, β) random variables has a gamma (Σαi, β) distribution.
If X1 is a Cauchy (μ1, σ1) random variable and X2 is an independent Cauchy (μ2, σ2) random variable, then X1 + X2 is a Cauchy (μ1 + μ2, σ1 + σ2) random variable.
If X1 and X2 are independent chi-squared random variables with ν1 and ν2 degrees of freedom respectively, then X1 + X2 is a chi-squared random variable with ν1 + ν2 degrees of freedom.
If X1 is a normal (μ1, σ1²) random variable and X2 is an independent normal (μ2, σ2²) random variable, then X1 + X2 is a normal (μ1 + μ2, σ1² + σ2²) random variable.
The sum of N independent chi-squared (1) random variables has a chi-squared distribution with N degrees of freedom.
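The Poisson case above can be verified directly: the pmf of X1 + X2, obtained by discrete convolution, matches the Poisson (μ1 + μ2) pmf term by term. A minimal check (the means are arbitrary example values):

```python
import math

def pois_pmf(k, mu):
    # Poisson probability mass function.
    return math.exp(-mu) * mu**k / math.factorial(k)

mu1, mu2 = 1.3, 2.2   # example means

k = 4
# P(X1 + X2 = k) via discrete convolution of the two pmfs:
conv = sum(pois_pmf(j, mu1) * pois_pmf(k - j, mu2) for j in range(k + 1))
# Closed form claimed above:
closed_form = pois_pmf(k, mu1 + mu2)
```

The agreement is exact up to rounding, since the convolution sum reduces to the binomial theorem.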
Other distributions are not closed under convolution, but their sum has a known distribution:
The sum of n independent Bernoulli (p) random variables is a binomial (n, p) random variable.
The sum of n independent geometric random variables with probability of success p is a negative binomial random variable with parameters n and p.
The sum of n independent exponential (β) random variables is a gamma (n, β) random variable. Since n is an integer, this gamma distribution is also an Erlang distribution.
The sum of the squares of N independent standard normal random variables has a chi-squared distribution with N degrees of freedom.
If X1 and X2 are independent log-normal random variables with parameters (μ1, σ1²) and (μ2, σ2²) respectively, then X1X2 is a log-normal random variable with parameters (μ1 + μ2, σ1² + σ2²).
If X1 and X2 are independent geometric random variables with probability of success p1 and p2 respectively, then min(X1, X2) is a geometric random variable with probability of success p = p1 + p2 − p1p2. The relationship is simpler if expressed in terms of the probability of failure: q = q1q2.
If X1 and X2 are independent exponential random variables with rate μ1 and μ2 respectively, then min(X1, X2) is an exponential random variable with rate μ = μ1 + μ2.
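The exponential case follows from independence alone, since P(min > t) factors into the product of the two survival functions. A quick numerical check (rates chosen arbitrarily):

```python
import math

rate1, rate2 = 0.7, 1.9   # example rates; P(X > t) = exp(-rate * t)
t = 0.8

# By independence, P(min(X1, X2) > t) = P(X1 > t) * P(X2 > t):
surv_min = math.exp(-rate1 * t) * math.exp(-rate2 * t)
# ...which is exactly the survival function of an exponential(rate1 + rate2):
surv_sum = math.exp(-(rate1 + rate2) * t)
```

The same factorization argument gives the geometric result: P(min > k) = q1^k · q2^k = (q1q2)^k.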
Similarly, distributions for which the maximum value of several independent random variables is a member of the same family of distributions include:
Bernoulli distribution, Power law distribution.
Other
If X and Y are independent standard normal random variables, X/Y is a Cauchy (0,1) random variable.
If X1 and X2 are independent chi-squared random variables with ν1 and ν2 degrees of freedom respectively, then (X1/ν1)/(X2/ν2) is an F(ν1, ν2) random variable.
If X is a standard normal random variable and U is an independent chi-squared random variable with ν degrees of freedom, then X/√(U/ν) is a Student's t(ν) random variable.
If X1 is a gamma (α1, 1) random variable and X2 is an independent gamma (α2, 1) random variable then X1/(X1 + X2) is a beta(α1, α2) random variable. More generally, if X1 is a gamma(α1, β1) random variable and X2 is an independent gamma(α2, β2) random variable then β2 X1/(β2X1 + β1X2) is a beta(α1, α2) random variable.
If X and Y are independent exponential random variables with mean μ, then X − Y is a double exponential random variable with mean 0 and scale μ.
If Xi are independent Bernoulli random variables then their parity (XOR) is a Bernoulli variable described by the piling-up lemma.
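The piling-up lemma can be checked by exhaustive enumeration for a small number of variables. A sketch (the biases below are arbitrary example values):

```python
import itertools
import math

# For independent Bernoulli(p_i), the piling-up lemma states
# P(X1 xor ... xor Xn = 1) = 1/2 - (1/2) * prod(1 - 2*p_i).
ps = [0.3, 0.45, 0.9]   # example biases

prob_parity_one = 0.0
for bits in itertools.product([0, 1], repeat=len(ps)):
    # Probability of this particular outcome, by independence:
    weight = 1.0
    for bit, p in zip(bits, ps):
        weight *= p if bit == 1 else (1.0 - p)
    if sum(bits) % 2 == 1:
        prob_parity_one += weight

closed_form = 0.5 - 0.5 * math.prod(1 - 2 * p for p in ps)
```

The enumerated probability and the closed form agree exactly.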
An approximate or limit relationship means either that the combination of an infinite number of iid random variables tends to some distribution, or that the limit, when a parameter tends to some value, approaches a different distribution.
Combination of iid random variables:
Given certain conditions, the sum (hence the average) of a sufficiently large number of iid random variables, each with finite mean and variance, will be approximately normally distributed. This is the central limit theorem (CLT).
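A small simulation illustrates the CLT (an illustration, not a proof; the sample sizes are arbitrary): averages of n iid Uniform(0, 1) draws should be approximately normal with mean 1/2 and variance 1/(12n).

```python
import random
import statistics

random.seed(0)           # fixed seed for reproducibility
n, reps = 48, 20_000     # n draws per average, reps simulated averages

averages = [statistics.fmean(random.random() for _ in range(n))
            for _ in range(reps)]

mean_hat = statistics.fmean(averages)
sd_hat = statistics.stdev(averages)
sd_theory = (1.0 / (12 * n)) ** 0.5   # CLT-predicted sd of the average
```

The empirical mean and standard deviation land close to 1/2 and sd_theory, and a histogram of `averages` would show the familiar bell shape.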
Special case of distribution parametrization:
X is a hypergeometric (m, N, n) random variable. If N and m are large compared to n, and p = m/N is not close to 0 or 1, then X approximately has a binomial (n, p) distribution.
X is a beta-binomial random variable with parameters (n, α, β). Let p = α/(α + β) and suppose α + β is large, then X approximately has a binomial(n, p) distribution.
If X is a binomial (n, p) random variable with n large and p small, then X approximately has a Poisson (np) distribution.
If X is a negative binomial random variable with r large, p near 1, and r(1 − p) = λ, then X approximately has a Poisson distribution with mean λ.
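The binomial-to-Poisson approximation can be quantified by comparing the two pmfs directly (n and p below are arbitrary, chosen so that n is large, p is small, and np = 4):

```python
import math

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def pois_pmf(k, mu):
    return math.exp(-mu) * mu**k / math.factorial(k)

n, p = 10_000, 0.0004   # np = 4
# Largest pointwise gap between the binomial pmf and its Poisson limit:
max_err = max(abs(binom_pmf(k, n, p) - pois_pmf(k, n * p))
              for k in range(31))
```

By Le Cam's theorem the sum of the pointwise gaps is at most 2np², so the gap shrinks as p decreases with np held fixed.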
Consequences of the CLT:
If X is a Poisson random variable with large mean, then for integers j and k, P(j ≤ X ≤ k) approximately equals P(j − 1/2 ≤ Y ≤ k + 1/2), where Y is a normal random variable with the same mean and variance as X.
If X is a binomial (n, p) random variable with large np and n(1 − p), then for integers j and k, P(j ≤ X ≤ k) approximately equals P(j − 1/2 ≤ Y ≤ k + 1/2), where Y is a normal random variable with the same mean and variance as X, i.e. np and np(1 − p).
If X is a beta random variable with parameters α and β equal and large, then X approximately has a normal distribution with the same mean and variance, i.e. mean α/(α + β) and variance αβ/((α + β)²(α + β + 1)).
If X is a gamma (α, β) random variable and the shape parameter α is large relative to the scale parameter β, then X approximately has a normal distribution with the same mean and variance.
If X is a Student's t random variable with a large number of degrees of freedom ν then X approximately has a standard normal distribution.
If X is an F(ν, ω) random variable with ω large, then νX is approximately distributed as a chi-squared random variable with ν degrees of freedom.
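The binomial case with continuity correction can be checked end to end with nothing but the error function (n, p, j, k below are arbitrary example values satisfying the "large np and n(1 − p)" condition):

```python
import math

def norm_cdf(x, mu, sigma):
    # Normal CDF via the error function.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

n, p = 400, 0.3                       # np = 120, n(1 - p) = 280
mu, sigma = n * p, math.sqrt(n * p * (1 - p))
j, k = 110, 130

# Exact binomial probability P(j <= X <= k):
exact = sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
            for i in range(j, k + 1))
# Normal approximation with the half-unit continuity correction:
approx = norm_cdf(k + 0.5, mu, sigma) - norm_cdf(j - 0.5, mu, sigma)
```

With these values the two probabilities differ by well under a percentage point.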
Compound (or Bayesian) relationships
When one or more parameters of a distribution are themselves random variables, the compound distribution is the marginal distribution of the variable.
Examples:
If X | N is a binomial (N, p) random variable, where parameter N is a random variable with negative-binomial (m, r) distribution, then X is distributed as a negative-binomial (m, r/(p + qr)), where q = 1 − p.
If X | N is a binomial (N,p) random variable, where parameter N is a random variable with Poisson(μ) distribution, then X is distributed as a Poisson (μp).
If X | μ is a Poisson (μ) random variable and parameter μ is a random variable with gamma (m, θ) distribution (where θ is the scale parameter), then X is distributed as a negative-binomial (m, θ/(1 + θ)), sometimes called the gamma-Poisson distribution.
If X is a Binomial(n,p) random variable, and parameter p is a random variable with beta(α, β) distribution, then X is distributed as a Beta-Binomial(α,β,n).
If X is a negative-binomial (r, p) random variable, and parameter p is a random variable with beta (α, β) distribution, then X is distributed as a beta negative binomial (r, α, β) random variable.
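The beta-binomial case can be verified numerically: integrating the binomial (n, p) pmf against the beta (α, β) density recovers the closed-form beta-binomial pmf C(n, k) B(k + α, n − k + β)/B(α, β). A sketch (parameter values are arbitrary):

```python
import math

def log_beta(a, b):
    # Log of the Beta function via log-gamma.
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

n, alpha, beta_ = 10, 2.0, 3.0
k = 4

# Closed-form beta-binomial pmf: C(n, k) * B(k + a, n - k + b) / B(a, b)
closed = math.comb(n, k) * math.exp(
    log_beta(k + alpha, n - k + beta_) - log_beta(alpha, beta_))

# Marginal pmf by midpoint-rule integration of Binomial(n, p) against
# the beta density over p in (0, 1):
steps = 20_000
total = 0.0
for i in range(steps):
    p = (i + 0.5) / steps
    dens = p**(alpha - 1) * (1 - p)**(beta_ - 1) / math.exp(log_beta(alpha, beta_))
    total += math.comb(n, k) * p**k * (1 - p)**(n - k) * dens / steps
```

The numerical marginal and the closed form agree to high precision, since the integrand is a smooth polynomial.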