Asymptotic properties of predictive recursion: Robustness and rate of convergence

Journal Article

Here we explore general asymptotic properties of predictive recursion (PR) for nonparametric estimation of mixing distributions. We prove that, when the mixture model is misspecified, the estimated mixture converges almost surely in total variation to the mixture that minimizes the Kullback-Leibler divergence, and we obtain a bound on the (Hellinger contrast) rate of convergence. Simulations suggest that this rate is nearly sharp in a minimax sense. Moreover, when the model is identifiable, almost sure weak convergence of the mixing distribution estimate follows. PR assumes that the support of the mixing distribution is known. To remove this requirement, we propose a generalization that incorporates a sequence of supports, increasing with the sample size, which combines the efficiency of PR with the flexibility of mixture sieves. Under mild conditions, we obtain a bound on the rate of convergence of these new estimates. © 2009, Institute of Mathematical Statistics. All rights reserved.
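
To make the recursion concrete, here is a minimal Python sketch of the PR update on a fixed grid; it is a hedged illustration, not the authors' implementation. The unit-variance normal kernel, the weight sequence w_i = (i + 1)^(-gamma), and the function name predictive_recursion are all illustrative assumptions not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def predictive_recursion(x, grid, gamma=0.67):
    """Sketch of a PR estimate of a mixing density on a fixed grid,
    assuming a normal location kernel p(x | u) = N(x; u, 1) and
    weights w_i = (i + 1)^(-gamma); these choices are illustrative."""
    du = grid[1] - grid[0]                              # grid spacing for quadrature
    f = np.full_like(grid, 1.0 / (grid[-1] - grid[0]))  # uniform initial guess f_0
    for i, xi in enumerate(x, start=1):
        w = (i + 1.0) ** (-gamma)                       # decaying weight sequence
        like = norm.pdf(xi, loc=grid, scale=1.0)        # kernel p(x_i | u) on the grid
        m = np.sum(like * f) * du                       # current mixture density at x_i
        f = (1.0 - w) * f + w * like * f / m            # one PR update
    return f

# Usage: recover a two-point mixing distribution from noisy observations.
rng = np.random.default_rng(0)
u = rng.choice([-2.0, 2.0], size=500)      # latent mixing draws
x = rng.normal(loc=u, scale=1.0)           # observed mixture data
grid = np.linspace(-6.0, 6.0, 401)         # PR assumes this support is known
f_hat = predictive_recursion(x, grid)
print("estimate integrates to:", round(np.sum(f_hat) * (grid[1] - grid[0]), 3))
```

The fixed grid stands in for the known support that PR requires; the paper's sieve generalization would instead let the support grow with the sample size, which this sketch does not attempt.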

Cited Authors

  • Martin, R; Tokdar, ST

Published Date

  • January 1, 2009

Published In

  • Electronic Journal of Statistics

Volume / Issue

  • 3 /

Start / End Page

  • 1455 - 1472

International Standard Serial Number (ISSN)

  • 1935-7524

Digital Object Identifier (DOI)

  • 10.1214/09-EJS458

Citation Source

  • Scopus