Asymptotic properties of predictive recursion: Robustness and rate of convergence
Here we explore general asymptotic properties of predictive recursion (PR) for nonparametric estimation of mixing distributions. We prove that, when the mixture model is mis-specified, the estimated mixture converges almost surely in total variation to the mixture that minimizes the Kullback-Leibler divergence, and we obtain a bound on the (Hellinger contrast) rate of convergence. Simulations suggest that this rate is nearly sharp in a minimax sense. Moreover, when the model is identifiable, almost sure weak convergence of the mixing distribution estimate follows. PR assumes that the support of the mixing distribution is known. To remove this requirement, we propose a generalization that incorporates a sequence of supports, increasing with the sample size, combining the efficiency of PR with the flexibility of mixture sieves. Under mild conditions, we obtain a bound on the rate of convergence of these new estimates.
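For readers unfamiliar with the algorithm, the core of PR is a one-pass stochastic update of a mixing density estimate f_i on a fixed support: given data X_1, X_2, ..., a kernel p(x | θ), and weights w_i decreasing to zero, it sets f_i(θ) = (1 − w_i) f_{i−1}(θ) + w_i · p(X_i | θ) f_{i−1}(θ) / m_{i−1}(X_i), where m_{i−1}(x) = ∫ p(x | θ) f_{i−1}(θ) dθ is the current mixture density estimate. Below is a minimal Python sketch of this recursion, assuming a Gaussian kernel N(θ, 1), a fixed uniform grid as the known support, and the common weight choice w_i = (i + 1)^(−γ) with γ in (1/2, 1]; the function name and all numerical choices here are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def predictive_recursion(x, grid, gamma=0.67):
    """Sketch of the basic PR update for a mixing density on a fixed grid.

    x     : 1-d array of observations
    grid  : grid of support points (theta values) for the mixing distribution
    gamma : weight-decay exponent; w_i = (i + 1)**(-gamma), gamma in (1/2, 1]
    Returns the estimated mixing density evaluated on `grid`.
    """
    f = np.ones_like(grid) / (grid[-1] - grid[0])  # uniform initial guess f_0
    d_theta = np.gradient(grid)                    # quadrature weights for the grid
    for i, xi in enumerate(x, start=1):
        w = (i + 1.0) ** (-gamma)                  # vanishing weight sequence
        like = norm.pdf(xi, loc=grid, scale=1.0)   # kernel p(x | theta): N(theta, 1)
        m = np.sum(like * f * d_theta)             # current mixture density at x_i
        f = (1.0 - w) * f + w * like * f / m       # PR update of the mixing density
    return f

# Toy usage: data from a two-point mixing distribution at -2 and +2.
rng = np.random.default_rng(0)
theta = rng.choice([-2.0, 2.0], size=500)
x = theta + rng.standard_normal(500)
grid = np.linspace(-5.0, 5.0, 201)
f_hat = predictive_recursion(x, grid)              # should concentrate near -2 and +2
```

Note that the recursion depends on the ordering of the data and makes only a single pass, which is the source of PR's computational efficiency; the paper's generalization replaces the fixed `grid` above with a sequence of supports that grows with the sample size.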