Problem
Suppose we have a sequence of i.i.d. Gaussian random variables $X_1, X_2, \dots, X_n$ with zero mean and variance $\sigma^2$, and let $S_k = \sum_{i=1}^{k} X_i$ denote the $k$-th partial sum. We want to bound the probability that the running maximum exceeds a threshold $a > 0$:

$$
P\Big(\max_{1 \le k \le n} S_k \ge a\Big).
$$
Sub-gaussian random variables
To make this problem more general, it is always safe to relax Gaussian random variables to zero-mean sub-gaussian random variables.
Intuitively, sub-gaussian r.v.'s have tails decaying at least as fast as those of Gaussian r.v.'s. Hence, most of the inequalities related to Gaussian r.v.'s can be safely applied to sub-gaussians without any modifications! You may refer to Subgaussian random variables: An expository note for more details. Here, we assume each $X_i$ is $\sigma$-sub-gaussian with zero mean, i.e.,

$$
\mathbb{E}\big[e^{\lambda X_i}\big] \le e^{\sigma^2 \lambda^2 / 2} \quad \text{for all } \lambda \in \mathbb{R}.
$$

An elementary property of a $\sigma$-sub-gaussian random variable $X$ is the tail bound $P(X \ge t) \le \exp\big(-t^2/(2\sigma^2)\big)$ for all $t > 0$; a one-line derivation is given after the facts below.
Also, it is useful to know the following facts:
- If $X_1$ is $\sigma_1$-sub-gaussian and $X_2$ is $\sigma_2$-sub-gaussian, and they are independent, then $X_1 + X_2$ is $\sqrt{\sigma_1^2 + \sigma_2^2}$-sub-gaussian, and
- if $X$ is $\sigma$-sub-gaussian, then for any $c \in \mathbb{R}$, $cX$ is $|c|\sigma$-sub-gaussian.
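As promised, here is the derivation of the elementary tail bound from the MGF condition, via Chernoff's method (the same trick reappears in the alternative bound below):

$$
P(X \ge t) \le \inf_{\lambda > 0} e^{-\lambda t}\,\mathbb{E}\big[e^{\lambda X}\big] \le \inf_{\lambda > 0} \exp\Big(-\lambda t + \frac{\sigma^2 \lambda^2}{2}\Big) = \exp\Big(-\frac{t^2}{2\sigma^2}\Big),
$$

where the first step is Markov's inequality applied to $e^{\lambda X}$, and the infimum is attained at $\lambda = t/\sigma^2$.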
A naive way
Obviously, we can use a union bound to give a naive bound.
For each $k$, the two facts above imply that $S_k$ is $\sqrt{k}\sigma$-sub-gaussian, so $P(S_k \ge a) \le \exp\big(-a^2/(2k\sigma^2)\big)$. The union bound then gives

$$
P\Big(\max_{1 \le k \le n} S_k \ge a\Big) \le \sum_{k=1}^{n} \exp\Big(-\frac{a^2}{2k\sigma^2}\Big),
$$

from which we can see the upper bound is no less than $\exp\big(-a^2/(2n\sigma^2)\big)$, its largest term. This bound is very loose; you will see the reason later.
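To get a feel for how loose, here is a minimal numerical sketch; the values $n = 100$, $a = 20$, $\sigma = 1$ are my own illustrative choices, not from the analysis above:

```python
import math

def naive_bound(n: int, a: float, sigma: float = 1.0) -> float:
    """Union bound: sum over k of exp(-a^2 / (2 k sigma^2))."""
    return sum(math.exp(-a**2 / (2 * k * sigma**2)) for k in range(1, n + 1))

n, a, sigma = 100, 20.0, 1.0  # illustrative values only, not from the text
print(f"naive union bound: {naive_bound(n, a, sigma):.4f}")
print(f"its largest term : {math.exp(-a**2 / (2 * n * sigma**2)):.4f}")
```

With these values the sum comes out above $1$, so the union bound says nothing at all, while its largest term alone is $e^{-2} \approx 0.135$; the alternative bound below achieves exactly that largest term.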
An alternative way
Doob’s martingale inequality
The formal statement of Doob's martingale inequality can be found in [1]. We restate it below.
Suppose the sequence $(M_k)_{k \ge 0}$ is a non-negative submartingale. Then for any $C > 0$ and any $n$,

$$
P\Big(\max_{0 \le k \le n} M_k \ge C\Big) \le \frac{\mathbb{E}[M_n]}{C}.
$$
With this tool in mind, we are now ready to bound $P\big(\max_{1 \le k \le n} S_k \ge a\big)$.

Using standard Chernoff's method, for any $\lambda > 0$,

$$
P\Big(\max_{1 \le k \le n} S_k \ge a\Big) = P\Big(\max_{1 \le k \le n} e^{\lambda S_k} \ge e^{\lambda a}\Big) \le e^{-\lambda a}\,\mathbb{E}\big[e^{\lambda S_n}\big],
$$

where the inequality is Doob's martingale inequality applied to $e^{\lambda S_k}$, which is a non-negative submartingale since $S_k$ is a martingale and $x \mapsto e^{\lambda x}$ is convex (Jensen's inequality). Since the $X_i$'s are $\sigma$-sub-gaussian,

$$
\mathbb{E}\big[e^{\lambda S_n}\big] = \mathbb{E}\Big[\prod_{i=1}^{n} e^{\lambda X_i}\Big] = \prod_{i=1}^{n} \mathbb{E}\big[e^{\lambda X_i}\big] \le e^{n \sigma^2 \lambda^2 / 2},
$$

where the second equality is due to the mutual independence of $X_1, \dots, X_n$. Combining the two displays,

$$
P\Big(\max_{1 \le k \le n} S_k \ge a\Big) \le \exp\Big(-\lambda a + \frac{n \sigma^2 \lambda^2}{2}\Big).
$$

The minimum is achieved when $\lambda = \frac{a}{n \sigma^2}$, giving

$$
P\Big(\max_{1 \le k \le n} S_k \ge a\Big) \le \exp\Big(-\frac{a^2}{2 n \sigma^2}\Big),
$$

which is only one term, the largest, of the summation in the naive bound.
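As a sanity check on the final bound, here is a minimal Monte-Carlo sketch; it uses standard Gaussian increments (so $\sigma = 1$) and the same illustrative $n$ and $a$ as before, both my own choices:

```python
import math
import numpy as np

rng = np.random.default_rng(0)
n, a, sigma = 100, 20.0, 1.0  # same illustrative values as before
trials = 50_000

# Empirical P(max_k S_k >= a) for i.i.d. N(0, sigma^2) increments.
steps = rng.normal(0.0, sigma, size=(trials, n))
running_max = np.cumsum(steps, axis=1).max(axis=1)
empirical = (running_max >= a).mean()

doob_bound = math.exp(-a**2 / (2 * n * sigma**2))  # exp(-2) ~ 0.1353
print(f"empirical frequency: {empirical:.4f}")
print(f"Doob-based bound   : {doob_bound:.4f}")
```

For Gaussian increments the reflection principle suggests the empirical frequency lands near $2\,P(S_n \ge a) \approx 0.045$, comfortably below the bound, whereas the naive union bound computed earlier was vacuous.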
Note that Lemma 2 in [2] gives the same statement. However, I think it is more direct to use Doob's martingale inequality.
1. Doob's martingale inequality, Wikipedia. ↩︎
2. Shengjia Zhao, Enze Zhou, Ashish Sabharwal, and Stefano Ermon. Adaptive Concentration Inequalities for Sequential Decision Problems. In NIPS, pages 1343–1351, 2016. ↩︎