Convergence and consistency of Newton’s algorithm for estimating mixing distribution

Publication Status

  • Published

Publication Type

  • Book Section

Abstract

© 2006 by Imperial College Press. All rights reserved. We provide a new convergence and consistency proof of Newton’s algorithm for estimating a mixing distribution, under rather strong conditions. An auxiliary result used in the proof shows that the Kullback-Leibler divergence between the estimate and the true mixing distribution converges as the number of observations tends to infinity; this holds under much weaker conditions. We point out that Newton’s proof of convergence, based on a representation of the algorithm as a nonhomogeneous weakly ergodic Markov chain, is incomplete. Our proof proceeds along quite different lines. We also study various other aspects of the estimate, including its claimed superiority to the Bayes estimate based on a Dirichlet mixture.
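
For orientation, the algorithm analyzed in the chapter is M. A. Newton’s recursive estimator, which updates the current estimate of the mixing distribution by a convex combination with a one-step posterior. The following is a minimal sketch of that recursion discretized on a grid, not an implementation from the chapter itself: the function name, the uniform initial guess, the default step sizes w_i = 1/(i+1), and the Gaussian kernel in the usage example are all illustrative assumptions.

    import numpy as np

    def newton_recursive_estimate(x, grid, likelihood, q0=None, weights=None):
        """Sketch of Newton's recursive mixing-distribution estimate on a grid.

        x          : 1-D array of observations, processed in order.
        grid       : 1-D array of candidate mixing-parameter values theta.
        likelihood : likelihood(x_i, grid) -> array of p(x_i | theta) over the grid.
        q0         : initial guess for the mixing distribution (default: uniform).
        weights    : step sizes w_i in (0, 1); default w_i = 1/(i+1) (an assumption).
        """
        q = np.full(len(grid), 1.0 / len(grid)) if q0 is None else np.asarray(q0, dtype=float)
        for i, xi in enumerate(x, start=1):
            w = 1.0 / (i + 1) if weights is None else weights[i - 1]
            lik = likelihood(xi, grid)        # p(x_i | theta) evaluated on the grid
            post = lik * q
            post = post / post.sum()          # one-step "posterior" given q_{i-1}
            q = (1.0 - w) * q + w * post      # convex recursive update
        return q

    # Illustrative usage: normal location mixture p(x | theta) = N(theta, 1),
    # true mixing distribution 0.5*delta_{-2} + 0.5*delta_{+2}.
    rng = np.random.default_rng(0)
    theta_true = rng.choice([-2.0, 2.0], size=500)
    x = theta_true + rng.standard_normal(500)
    grid = np.linspace(-5.0, 5.0, 101)
    normal_lik = lambda xi, t: np.exp(-0.5 * (xi - t) ** 2) / np.sqrt(2.0 * np.pi)
    q_hat = newton_recursive_estimate(x, grid, normal_lik)  # mass should pile up near ±2

Note that the update depends on the order in which observations arrive, which is one reason its convergence analysis is delicate; the chapter’s proof addresses this along different lines from Newton’s Markov-chain argument.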

Cited Authors

  • Ghosh, JK; Tokdar, ST

Published Date

  • January 1, 2006

Book Title

  • Frontiers in Statistics: Dedicated to Peter John Bickel in Honor of his 65th Birthday

Start / End Page

  • 429 - 443

International Standard Book Number 10 (ISBN-10)

  • 1860946704

International Standard Book Number 13 (ISBN-13)

  • 9781860946707

Digital Object Identifier (DOI)

  • 10.1142/9781860948886_0019

Citation Source

  • Scopus