Abstract
A number of generalizations of the Kolmogorov strong law of large numbers are known, including convex combinations of random variables (rvs) with random coefficients. In the case of pairs of i.i.d. rvs (X_1, Y_1), ..., (X_n, Y_n), with mu being the probability distribution of the Xs, the averages of the Ys for which the accompanying Xs are in a vicinity of a given point x may converge with probability 1 (w.p. 1), and for mu-almost every (mu a.e.) x, to the conditional expectation r(x) = E(Y | X = x). We consider the Nadaraya-Watson estimator of E(Y | X = x), where the vicinities of x are determined by window widths h(n). Its convergence towards r(x) w.p. 1 and for mu a.e. x under the condition E|Y| < infinity is called a strong law of large numbers for conditional expectations (SLLNCE). If no assumptions on mu are required other than those implied by E|Y| < infinity, then the SLLNCE is called universal. In the present paper we investigate the minimal assumptions for the SLLNCE and for the universal SLLNCE. We improve the best-known results in this direction.
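For orientation, the standard kernel form of the Nadaraya-Watson estimator of r(x) is sketched below; the generic kernel K and the bandwidth sequence h(n) are assumptions here, since the paper's window-width formulation may fix a specific (e.g. uniform) kernel.

```latex
% Standard Nadaraya-Watson kernel regression estimator of r(x) = E(Y | X = x),
% built from the sample (X_1, Y_1), ..., (X_n, Y_n) with bandwidth h_n:
r_n(x) = \frac{\sum_{i=1}^{n} Y_i \, K\!\left(\frac{x - X_i}{h_n}\right)}
              {\sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h_n}\right)}
```

With the uniform kernel K(u) = 1{|u| <= 1}, this is exactly the average of the Y_i whose accompanying X_i fall in the vicinity [x - h_n, x + h_n] of x, matching the description in the abstract.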
| Original language | English |
|---|---|
| Pages (from-to) | 143-165 |
| Number of pages | 23 |
| Journal | Bernoulli |
| Volume | 4 |
| Issue number | 2 |
| Publication status | Published - Jun 1998 |
Keywords
- conditional expectation
- kernel estimator
- Nadaraya-Watson estimator
- nonparametric regression
- strong convergence
- strong law of large numbers
- universal convergence