This book presents a systematic and comprehensive treatment of the various prior processes that have been developed over the past four decades for applying the Bayesian approach to selected nonparametric inference problems. This revised edition has been substantially expanded to reflect the current interest in the area. After an overview of the different prior processes, it examines the now pre-eminent Dirichlet process and its variants, including hierarchical processes, and then addresses newer processes such as the dependent Dirichlet, local Dirichlet, time-varying and spatial processes, all of which exploit the countable mixture representation of the Dirichlet process. It subsequently discusses various neutral to the right processes, including the gamma and extended gamma, beta and beta-Stacy processes, and then describes the Chinese Restaurant, Indian Buffet and infinite gamma-Poisson processes, which prove very useful in areas such as machine learning, information retrieval and featural modeling. Tailfree and Polya tree processes and their extensions form a separate chapter, while the last two chapters present Bayesian solutions to certain estimation problems concerning the distribution function and its functionals, based on complete as well as right-censored data. Because of the conjugacy property of some of these processes, most solutions are presented in closed form.
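As a point of reference for the countable mixture representation mentioned above, the following is a brief illustrative sketch of Sethuraman's stick-breaking construction of the Dirichlet process, written here with a concentration parameter $\alpha > 0$ and base measure $G_0$ (the notation is chosen for illustration; the book develops this representation in full):
\[
G = \sum_{k=1}^{\infty} \pi_k\, \delta_{\theta_k},
\qquad
\pi_k = V_k \prod_{j=1}^{k-1} (1 - V_j),
\qquad
V_k \overset{\text{iid}}{\sim} \operatorname{Beta}(1,\alpha),
\qquad
\theta_k \overset{\text{iid}}{\sim} G_0 .
\]
The dependent, local, time-varying and spatial processes referred to above arise by letting the weights $\pi_k$ or the atoms $\theta_k$ vary with covariates, time or location.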
However, the current interest in modeling and analyzing large-scale and complex data also poses a problem: the posterior distribution, which is central to Bayesian analysis, is invariably not available in closed form, making it necessary to resort to simulation. Accordingly, the book also introduces several computational procedures, such as the Gibbs sampler, blocked Gibbs sampler and slice sampling, highlighting the essential steps of each algorithm as specific models are discussed. In addition, it presents the crucial steps of proofs and derivations, explains the relationships between the different processes and provides further clarifications to promote a deeper understanding. Lastly, it includes a comprehensive list of references, equipping readers to explore the subject further on their own.