by Emin Orhan
Here is a nice theorem about Gaussian tails. Suppose $S_n = X_1 + X_2 + \cdots + X_n$, where the $X_k$ are mutually independent random variables with mean $\mathbf{E}[X_k] = 0$ and variance $\mathbf{E}[X_k^2] = \sigma_k^2$. Denote $B_n = \sum_{k=1}^n \sigma_k^2$. We consider the probability $P(S_n \geq x_n)$ as $x_n / \sqrt{B_n} \to \infty$, that is, $x_n$ goes to infinity faster than $\sqrt{B_n}$ (though not too fast: we also require $x_n = o(B_n^{2/3})$). Assume further that:

$$\sum_{k=1}^n \mathbf{E}\,|X_k|^3 = O(B_n).$$
A simple sufficient condition for this last equation to hold is that the $X_k$ be bounded, i.e. that there be a constant $C$ such that $|X_k| \leq C$ for all $k$: then $\mathbf{E}\,|X_k|^3 \leq C \sigma_k^2$, and summing over $k$ gives the condition. Then the following theorem holds (the proof is elementary and can be found in this book):

$$P(S_n \geq x_n) \sim \frac{\sqrt{B_n}}{x_n \sqrt{2\pi}} \exp\!\left(-\frac{x_n^2}{2 B_n}\right) \quad \text{as } n \to \infty.$$
This is exactly the tail behavior of a Gaussian random variable with variance $B_n$. Hence, this theorem shows that under fairly general conditions, the tails of sums of independent random variables asymptotically behave like Gaussian tails.
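As a quick numerical sanity check (not part of the original post), here is a Monte Carlo sketch in Python/NumPy. It estimates $P(S_n \geq x\sqrt{n})$ for bounded, mean-zero, variance-one summands (uniform on $[-\sqrt{3}, \sqrt{3}]$, so $B_n = n$) and compares the estimate with the exact Gaussian tail $1 - \Phi(x)$ and with the asymptotic expression $e^{-x^2/2}/(x\sqrt{2\pi})$. All parameter choices (n, number of trials, threshold) are illustrative.

```python
import numpy as np
from math import erf, exp, pi, sqrt

def gaussian_tail(x):
    """Exact standard-normal upper tail 1 - Phi(x)."""
    return 0.5 * (1.0 - erf(x / sqrt(2.0)))

def asymptotic_tail(x):
    """Asymptotic Gaussian tail exp(-x^2/2) / (x * sqrt(2*pi))."""
    return exp(-x * x / 2.0) / (x * sqrt(2.0 * pi))

rng = np.random.default_rng(0)
n = 400           # number of summands (illustrative choice)
trials = 200_000  # Monte Carlo repetitions
x = 2.5           # threshold in units of sqrt(B_n) = sqrt(n)

# Bounded, mean-zero, variance-1 summands: Uniform(-sqrt(3), sqrt(3)).
a = sqrt(3.0)
hits = 0
for _ in range(trials // 20_000):  # batch to keep memory modest
    X = rng.uniform(-a, a, size=(20_000, n))
    hits += int(np.count_nonzero(X.sum(axis=1) >= x * sqrt(n)))

empirical = hits / trials
print(f"empirical P(S_n >= x*sqrt(n)): {empirical:.5f}")
print(f"Gaussian tail 1 - Phi(x):      {gaussian_tail(x):.5f}")
print(f"asymptotic tail:               {asymptotic_tail(x):.5f}")
```

At a moderate threshold like $x = 2.5$ the empirical frequency should land close to $1 - \Phi(x) \approx 0.0062$, while the asymptotic expression overshoots slightly, since it only becomes exact as $x \to \infty$.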