Large deviations

Here is a nice theorem about Gaussian tails. Suppose Y_n = \sum_{i=1}^n X_{i,n}, where the X_{i,n} are mutually independent random variables with mean \mathrm{E}[X_{i,n}] = 0 and variance \mathrm{Var}[X_{i,n}] = \sigma_{i,n}^2. Denote \mathrm{Var}[Y_n] = \sigma_n^2 = \sum_{i=1}^n \sigma_{i,n}^2. We consider the probability \mathrm{P}[Y_n \geq a_n \sigma_n] in the regime where \lambda_n \equiv a_n / \sigma_n \rightarrow 0 as n \rightarrow \infty, that is, where \sigma_n grows to infinity faster than a_n. Assume further that, uniformly in i:

\mathrm{E}[\exp(\lambda_n X_{i,n})] \leq \exp((1+o(1)) \lambda_n^2 \sigma_{i,n}^2 / 2)

A simple sufficient condition for this last bound to hold is that the X_{i,n} be uniformly bounded, i.e. that there be a constant K such that |X_{i,n}| \leq K for all i and n. To see why (a quick sketch, using \mathrm{E}[X_{i,n}] = 0, the moment bound |\mathrm{E}[X_{i,n}^k]| \leq K^{k-2} \sigma_{i,n}^2 for k \geq 3, and the fact that \lambda_n K \rightarrow 0 since K is fixed), expand the exponential:
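
\mathrm{E}[\exp(\lambda_n X_{i,n})] = 1 + \lambda_n^2 \sigma_{i,n}^2 / 2 + \sum_{k \geq 3} \lambda_n^k \mathrm{E}[X_{i,n}^k] / k! \leq \exp((1 + O(\lambda_n K)) \lambda_n^2 \sigma_{i,n}^2 / 2)

where the last step bounds the tail of the series by O(\lambda_n K) \lambda_n^2 \sigma_{i,n}^2 and uses 1 + x \leq \exp(x). With this assumption in place, the following theorem holds (the proof is elementary and can be found in this book):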

Theorem: \mathrm{P}[Y_n \geq a_n \sigma_n] \leq \exp(- (1+o(1))a_n^2 / 2)
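
For intuition, here is a sketch of the standard Chernoff-type argument, which is presumably the route the book takes. Apply Markov's inequality to \exp(\lambda_n Y_n) and use the independence of the X_{i,n} together with the assumption above:

\mathrm{P}[Y_n \geq a_n \sigma_n] \leq \exp(-\lambda_n a_n \sigma_n) \prod_{i=1}^n \mathrm{E}[\exp(\lambda_n X_{i,n})] \leq \exp(-\lambda_n a_n \sigma_n + (1+o(1)) \lambda_n^2 \sigma_n^2 / 2)

The choice \lambda_n = a_n / \sigma_n, which optimizes the exponent, turns it into -a_n^2 + (1+o(1)) a_n^2 / 2 = -(1+o(1)) a_n^2 / 2.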

Up to the o(1) term in the exponent, this is exactly the tail behavior of a Gaussian random variable: for a standard Gaussian Z, \mathrm{P}[Z \geq a] = \exp(-(1+o(1)) a^2 / 2) as a \rightarrow \infty. Hence, this theorem shows that under fairly general conditions, the tails of sums of independent random variables asymptotically behave like Gaussian tails.
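
As a quick numerical illustration, here is a minimal Monte Carlo sketch in Python. Everything in it is an illustrative assumption rather than something from the text: the summands are Rademacher variables (X_{i,n} = \pm 1, so K = 1, \sigma_{i,n} = 1, and \sigma_n = \sqrt{n}), and the values of n, a_n, and the number of trials are arbitrary.

    import numpy as np

    # Monte Carlo check of the Gaussian-tail bound for a sum of bounded,
    # independent, mean-zero variables. Illustrative setup: Rademacher
    # summands X = +/-1, so sigma_{i,n} = 1 and sigma_n = sqrt(n).

    rng = np.random.default_rng(0)

    n = 10_000        # number of summands
    a_n = 3.0         # deviation level; lambda_n = a_n / sigma_n = 0.03 is small
    trials = 500_000  # Monte Carlo sample size

    sigma_n = np.sqrt(n)

    # A sum of n Rademacher variables equals 2 * Binomial(n, 1/2) - n,
    # so Y_n can be sampled without materializing the n summands.
    Y = 2.0 * rng.binomial(n, 0.5, size=trials) - n

    empirical = np.mean(Y >= a_n * sigma_n)
    bound = np.exp(-a_n ** 2 / 2)  # the theorem's bound, dropping the o(1) term

    print(f"empirical P[Y_n >= a_n sigma_n] : {empirical:.5f}")
    print(f"bound     exp(-a_n^2 / 2)       : {bound:.5f}")

With these values the empirical frequency should come out around 1.3 \times 10^{-3}, close to \mathrm{P}[Z \geq 3] for a standard Gaussian and comfortably below \exp(-4.5) \approx 0.011; the gap is the polynomial prefactor that a purely exponential bound ignores.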
