Cramér's theorem is a fundamental result in the theory of large deviations, a subdiscipline of probability theory. It determines the rate function of a sequence of iid random variables.
A weak version of this result was first shown by Harald Cramér in 1938.
Statement
The logarithmic moment generating function (which is the cumulant-generating function) of a random variable $X_1$ is defined as:
$$\Lambda(t)=\log \operatorname{E}[\exp(tX_{1})].$$
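For illustration (an assumption made only for this example, not part of the statement), if $X_1$ is a standard normal random variable, then $\operatorname{E}[\exp(tX_1)]=e^{t^{2}/2}$, so

$$\Lambda(t)=\log e^{t^{2}/2}=\frac{t^{2}}{2}.$$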
Let $X_{1},X_{2},\dots$ be a sequence of iid real random variables with finite logarithmic moment generating function, i.e. $\Lambda(t)<\infty$ for all $t\in \mathbb{R}$.
Then the Legendre transform of $\Lambda$:

$$\Lambda^{*}(x):=\sup_{t\in \mathbb{R}}\left(tx-\Lambda(t)\right)$$

satisfies

$$\lim_{n\to \infty }\frac{1}{n}\log \operatorname{P}\!\left(\sum_{i=1}^{n}X_{i}\geq nx\right)=-\Lambda^{*}(x)$$

for all $x>\operatorname{E}[X_{1}]$.
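Continuing the illustrative standard normal example, the Legendre transform can be computed explicitly:

$$\Lambda^{*}(x)=\sup_{t\in \mathbb{R}}\left(tx-\frac{t^{2}}{2}\right)=\frac{x^{2}}{2},$$

the supremum being attained at $t=x$. For $x>\operatorname{E}[X_{1}]=0$, the theorem then says that $\operatorname{P}\!\left(\sum_{i=1}^{n}X_{i}\geq nx\right)$ decays like $e^{-nx^{2}/2}$ on the exponential scale.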
In the terminology of the theory of large deviations, the result can be reformulated as follows:

If $X_{1},X_{2},\dots$ is a sequence of iid random variables, then the distributions $\left(\mathcal{L}\!\left(\tfrac{1}{n}\sum_{i=1}^{n}X_{i}\right)\right)_{n\in \mathbb{N}}$ satisfy a large deviation principle with rate function $\Lambda^{*}$.
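The following is a minimal numerical sketch of the limit, assuming the standard normal example above (so $S_n=\sum_{i=1}^{n}X_i$ is exactly $\mathcal{N}(0,n)$-distributed and the tail probability can be evaluated directly with SciPy); the threshold $x$ and the values of $n$ are illustrative choices, not part of the theorem.

```python
# Illustrative check of Cramér's theorem for iid standard normal X_i
# (an assumption of this sketch): S_n = X_1 + ... + X_n is N(0, n), so
# P(S_n >= n*x) = P(Z >= sqrt(n)*x) for a standard normal Z, and
# (1/n) log P(S_n >= n*x) should approach -Lambda*(x) = -x**2 / 2.
import numpy as np
from scipy.stats import norm

x = 1.0                  # threshold above E[X_1] = 0 (illustrative choice)
rate = -x**2 / 2         # -Lambda*(x) for the standard normal example

for n in (10, 100, 1_000, 10_000):
    # log of the exact Gaussian tail probability P(S_n >= n*x)
    log_p = norm.logsf(np.sqrt(n) * x)
    print(f"n={n:6d}  (1/n) log P(S_n >= nx) = {log_p / n:+.4f}"
          f"  vs  -Lambda*(x) = {rate:+.4f}")
```

As $n$ grows, the printed empirical rate approaches $-x^{2}/2$, in line with the large deviation principle stated above.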