This chapter discusses a certain type of large model for independent and identically distributed (i.i.d.) data, namely the popular mixture model based on stick-breaking processes. It then shows how this structure can be extended to cover non-i.i.d. data, such as time series and regression models. These latter extensions require the calculation of a troublesome and unavoidable normalizing constant in order to do full Bayesian inference. Using a novel combination of latent models and Markov chain Monte Carlo (MCMC) techniques, the chapter shows that it is possible to provide satisfactory, fully Bayesian inference even in the non-i.i.d. case.
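The abstract itself contains no code. As an illustration of the stick-breaking construction underlying such mixture models, the following is a minimal sketch of truncated stick-breaking weights for a Dirichlet process (Sethuraman's construction); the function name and truncation level are assumptions for illustration, not part of the chapter:

```python
import numpy as np

def stick_breaking_weights(alpha, num_sticks, rng=None):
    """Draw truncated stick-breaking weights: v_k ~ Beta(1, alpha),
    w_k = v_k * prod_{j<k} (1 - v_j).  (Illustrative sketch only.)"""
    rng = np.random.default_rng(rng)
    v = rng.beta(1.0, alpha, size=num_sticks)          # stick proportions
    # Fraction of the stick remaining before each break:
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

# Example: 50 truncated weights; they are nonnegative and sum to < 1,
# with the residual mass attributable to the truncation.
w = stick_breaking_weights(alpha=2.0, num_sticks=50, rng=0)
```

In a mixture model these weights multiply component densities, giving an (almost surely) discrete random mixing distribution; the non-i.i.d. extensions discussed in the chapter modify this basic structure.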