1 Convergence of Empirical Distribution
Suppose $X_1, \dots, X_n \overset{\text{iid}}{\sim} F$, where $F$ is an unknown c.d.f. We want to estimate $F$.
A natural estimator is the empirical distribution $\hat F_n$:
$$\hat F_n(t) = \frac{1}{n} \sum_{i=1}^n \mathbb{1}\{X_i \le t\},$$
where for each fixed $t$,
- $\mathbb{1}\{X_1 \le t\}, \dots, \mathbb{1}\{X_n \le t\} \overset{\text{iid}}{\sim} \mathrm{Bernoulli}(F(t))$.
- Note that $\mathbb{E}[\mathbb{1}\{X_i \le t\}] = P(X_i \le t) = F(t)$.
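The definition above is easy to compute directly. A minimal sketch (the standard-normal data and the helper name `ecdf` are illustrative, not from the notes):

```python
import random

def ecdf(sample, t):
    """Empirical c.d.f. F_hat_n(t): the fraction of sample points <= t."""
    return sum(x <= t for x in sample) / len(sample)

random.seed(0)
sample = [random.gauss(0, 1) for _ in range(1000)]  # X_1,...,X_n iid N(0,1)
# F_hat_n(0) should be close to F(0) = 0.5 for standard-normal data.
print(ecdf(sample, 0.0))
```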
So by the SLLN, $\hat F_n(t) \xrightarrow{a.s.} F(t)$ for every $t$. I.e. $\forall t$,
$$P\left( \lim_{n \to \infty} \hat F_n(t) = F(t) \right) = 1.$$
If we expand the limit claim: $\forall t$, $\forall \epsilon > 0$, almost surely $\exists N$ s.t. $\forall n \ge N$, $|\hat F_n(t) - F(t)| \le \epsilon$. Here $N$ depends on $t$, so this is pointwise convergence.
One can obtain a stronger result:
Theorem (Glivenko-Cantelli)
Suppose $X_1, \dots, X_n \overset{\text{iid}}{\sim} F$. Then
$$\sup_t |\hat F_n(t) - F(t)| \xrightarrow{a.s.} 0.$$
In other words,
$$P\left( \lim_{n \to \infty} \sup_t |\hat F_n(t) - F(t)| = 0 \right) = 1.$$
If we also expand it: $\forall \epsilon > 0$, almost surely $\exists N$ s.t. $\forall n \ge N$, $\sup_t |\hat F_n(t) - F(t)| \le \epsilon$. Here $N$ does not depend on $t$, so this is uniform convergence.
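Uniform convergence can also be seen numerically. The illustrative sketch below (names are mine, not from the notes) approximates $\sup_t |\hat F_n(t) - F(t)|$ on a fine grid, assuming $\mathrm{Unif}(0,1)$ data so that $F(t) = t$; the worst-case gap shrinks as $n$ grows.

```python
import bisect
import random

def sup_gap(n, grid=2000, seed=1):
    """Approximate sup_t |F_hat_n(t) - F(t)| over a grid of t values,
    for n iid Unif(0,1) draws, where the true c.d.f. is F(t) = t."""
    rng = random.Random(seed)
    xs = sorted(rng.random() for _ in range(n))
    # bisect_right counts how many sorted sample points are <= t.
    return max(abs(bisect.bisect_right(xs, j / grid) / n - j / grid)
               for j in range(grid + 1))

for n in (100, 1000, 10000):
    print(n, round(sup_gap(n), 4))  # the gap shrinks as n grows
```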
The following proof and discussion were inserted from later notes. Readers can skip this part for now.
Define
$$D_n = \sup_t |\hat F_n(t) - F(t)|.$$
So this theorem is equivalent to $D_n \xrightarrow{a.s.} 0$. However, today we only prove a weaker version: $D_n \xrightarrow{P} 0$.
First suppose $F$ is continuous. Claim: the distribution of $D_n$ is the same for all continuous $F$.
Recall the quantile function $F^{-1}(u) = \inf\{t : F(t) \ge u\}$.
We have shown this result: if $U \sim \mathrm{Unif}(0,1)$ and $F$ is continuous, then $F^{-1}(U) \sim F$ and $F^{-1}(U) \le t \iff U \le F(t)$. So take $U_1, \dots, U_n \overset{\text{iid}}{\sim} \mathrm{Unif}(0,1)$ and $X_i = F^{-1}(U_i)$. Then $X_1, \dots, X_n \overset{\text{iid}}{\sim} F$ and
$$D_n = \sup_t \left| \frac{1}{n} \sum_{i=1}^n \mathbb{1}\{U_i \le F(t)\} - F(t) \right| = \sup_{u \in [0,1]} \left| \frac{1}{n} \sum_{i=1}^n \mathbb{1}\{U_i \le u\} - u \right|,$$
where the last equality uses that the range of a continuous $F$ is dense in $[0,1]$. But the last expression does not depend on $F$, which proves the claim.
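This distribution-free property can be checked by simulation. The sketch below (illustrative names; it uses the fact that the supremum of $|\hat F_n(t) - F(t)|$ is attained just before or at a data point, where $\hat F_n$ jumps from $\frac{i-1}{n}$ to $\frac{i}{n}$) compares the simulated distribution of $D_n$ for standard-normal and exponential data:

```python
import math
import random

def D_n(sample, cdf):
    """Exact sup_t |F_hat_n(t) - F(t)|: check both sides of each jump."""
    n = len(sample)
    u = sorted(cdf(x) for x in sample)
    return max(max(abs(i / n - u[i - 1]), abs((i - 1) / n - u[i - 1]))
               for i in range(1, n + 1))

normal_cdf = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))  # N(0,1)
exp_cdf = lambda x: 1 - math.exp(-x)                           # Exp(1)

rng = random.Random(0)
n, reps = 50, 2000
d1 = sorted(D_n([rng.gauss(0, 1) for _ in range(n)], normal_cdf)
            for _ in range(reps))
d2 = sorted(D_n([rng.expovariate(1) for _ in range(n)], exp_cdf)
            for _ in range(reps))
# The two simulated distributions of D_n should nearly coincide,
# e.g. their medians:
print(d1[reps // 2], d2[reps // 2])
```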
Hence we may take $F = \mathrm{Unif}(0,1)$. Let $U_1, \dots, U_n \overset{\text{iid}}{\sim} \mathrm{Unif}(0,1)$. Recall the order statistics $U_{(1)} \le U_{(2)} \le \dots \le U_{(n)}$.
Write $\hat G_n(u) = \frac{1}{n} \sum_{i=1}^n \mathbb{1}\{U_i \le u\}$ for their empirical c.d.f. Between jumps, $\hat G_n(u) - u$ is decreasing in $u$, so the supremum of $|\hat G_n(u) - u|$ occurs at either $U_{(i)}$ or $U_{(i)}^-$ for some $i$. $\hat G_n(U_{(i)}) = \frac{i}{n}$ and $\hat G_n(U_{(i)}^-) = \frac{i-1}{n}$. So
$$D_n = \max_{1 \le i \le n} \max\left\{ \left| \frac{i}{n} - U_{(i)} \right|, \left| \frac{i-1}{n} - U_{(i)} \right| \right\}.$$
By this result, $U_{(i)} \sim \mathrm{Beta}(i, n+1-i)$. So
$$\mathbb{E}[U_{(i)}] = \frac{i}{n+1}, \qquad \mathrm{Var}(U_{(i)}) = \frac{i(n+1-i)}{(n+1)^2(n+2)} \le \frac{1}{4(n+2)}.$$
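As a sanity check on the moments of uniform order statistics ($\mathbb{E}[U_{(i)}] = \frac{i}{n+1}$ and $\mathrm{Var}(U_{(i)}) \le \frac{1}{4(n+2)}$), a small illustrative simulation:

```python
import random

# Simulate the i-th order statistic U_(i) of n iid Unif(0,1) draws and
# check E[U_(i)] = i/(n+1) and Var(U_(i)) <= 1/(4(n+2)).
rng = random.Random(0)
n, i, reps = 10, 3, 20000
draws = [sorted(rng.random() for _ in range(n))[i - 1] for _ in range(reps)]
mean = sum(draws) / reps
var = sum((d - mean) ** 2 for d in draws) / reps
print(mean, i / (n + 1))       # empirical mean vs i/(n+1) = 3/11
print(var, 1 / (4 * (n + 2)))  # empirical variance vs the bound 1/48
```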
For every $\epsilon > 0$ and every fixed index $i$,
$$P\left( \left| U_{(i)} - \frac{i}{n+1} \right| > \epsilon \right) \le \frac{\mathrm{Var}(U_{(i)})}{\epsilon^2} \le \frac{1}{4(n+2)\epsilon^2}$$
by Markov's inequality. So for any fixed $k$ and indices $i_1 < \dots < i_k$, a union bound gives $\max_{1 \le j \le k} \left| U_{(i_j)} - \frac{i_j}{n+1} \right| \xrightarrow{P} 0$ as $n \to \infty$.
Furthermore, take $i_j = \lceil jn/k \rceil$. Since $i \mapsto U_{(i)}$ and $i \mapsto \frac{i}{n+1}$ are both nondecreasing and consecutive $\frac{i_j}{n+1}$ differ by at most $\frac{1}{k} + \frac{1}{n}$,
$$\max_{1 \le i \le n} \left| U_{(i)} - \frac{i}{n+1} \right| \le \max_{1 \le j \le k} \left| U_{(i_j)} - \frac{i_j}{n+1} \right| + \frac{1}{k} + \frac{1}{n},$$
and for every $i$, both $\left| \frac{i}{n} - \frac{i}{n+1} \right| \le \frac{1}{n}$ and $\left| \frac{i-1}{n} - \frac{i}{n+1} \right| \le \frac{1}{n}$, so
$$D_n \le \max_{1 \le j \le k} \left| U_{(i_j)} - \frac{i_j}{n+1} \right| + \frac{1}{k} + \frac{2}{n}.$$
Letting $n \to \infty$ and then $k \to \infty$, $D_n \xrightarrow{P} 0$.
2 Relation to Brownian Bridge Kernel
Recall the multivariate CLT: for fixed $t_1 < \dots < t_k$,
$$\sqrt{n} \left( \hat F_n(t_j) - F(t_j) \right)_{j=1}^{k} \xrightarrow{d} N(0, \Sigma),$$
where $\Sigma \in \mathbb{R}^{k \times k}$ with
$$\Sigma_{jl} = \mathrm{Cov}\left( \mathbb{1}\{X_1 \le t_j\}, \mathbb{1}\{X_1 \le t_l\} \right) = F(t_j \wedge t_l) - F(t_j) F(t_l).$$
This is true for all $k$ and all $t_1, \dots, t_k$. So the RHS corresponds to a Gaussian process with the Brownian bridge kernel: writing $u = F(s)$ and $v = F(t)$,
$$K(s, t) = F(s \wedge t) - F(s) F(t) = u \wedge v - uv.$$
Hence, for continuous $F$,
$$\sqrt{n}\, D_n = \sup_t \sqrt{n}\, \left| \hat F_n(t) - F(t) \right| \xrightarrow{d} \sup_{u \in [0,1]} |B(u)|,$$
where $B$ is a standard Brownian bridge on $[0,1]$.
We have another fact about the Brownian bridge kernel:
Theorem (Kolmogorov-Smirnov)
$$P\left( \sup_{u \in [0,1]} |B(u)| > x \right) = 2 \sum_{k=1}^{\infty} (-1)^{k-1} e^{-2k^2x^2}, \qquad x > 0.$$
The first term alone, $2e^{-2x^2}$, is already very accurate.
So if $n$ is large,
$$P\left( \sqrt{n}\, D_n > x \right) \approx 2e^{-2x^2}.$$
This can be used to find an asymptotic level $1 - \alpha$ confidence band for estimating $F(t)$ simultaneously for all $t$: solving $2e^{-2x^2} = \alpha$ gives $x = \sqrt{\frac{1}{2} \log \frac{2}{\alpha}}$, and the band $\hat F_n(t) \pm \frac{x}{\sqrt{n}}$ covers $F(t)$ for all $t$ simultaneously with probability approximately $1 - \alpha$.
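A sketch of such a simultaneous band (the helper name `ks_band` and the standard-normal data are illustrative; it uses the one-term approximation $x = \sqrt{\frac{1}{2} \log \frac{2}{\alpha}}$ and clips the band to $[0, 1]$):

```python
import bisect
import math
import random

def ks_band(sample, alpha=0.05):
    """Asymptotic level 1 - alpha simultaneous confidence band for F,
    F_hat_n(t) +/- x/sqrt(n), with x solving 2 exp(-2 x^2) = alpha."""
    n = len(sample)
    x = math.sqrt(0.5 * math.log(2 / alpha))
    half_width = x / math.sqrt(n)
    xs = sorted(sample)
    def band(t):
        f_hat = bisect.bisect_right(xs, t) / n  # empirical c.d.f. at t
        return max(f_hat - half_width, 0.0), min(f_hat + half_width, 1.0)
    return band

rng = random.Random(0)
band = ks_band([rng.gauss(0, 1) for _ in range(400)])
lo, hi = band(0.0)
print(lo, hi)  # a band that should typically cover F(0) = 0.5
```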