This book focuses on the theory and applications of discrete-time two-time-scale Markov chains. Much effort in this book is devoted to designing system models arising from these applications, analyzing them via analytic and probabilistic techniques, and developing feasible computational algorithms that reduce the inherent complexity. The book presents results including asymptotic expansions of probability vectors, structural properties of occupation measures, exponential bounds, aggregation and decomposition with their associated limit processes, and the interface between discrete-time and continuous-time systems. One of its salient features is a diverse range of applications in filtering, estimation, control, optimization, Markov decision processes, and financial engineering. The book will be an important reference for researchers in applied probability, control theory, and operations research, as well as for practitioners who use optimization techniques. Parts of the book can also be used in a graduate course on applied probability, stochastic processes, and applications.
From the reviews: "Discrete-time Markov chains are the basic building blocks for understanding random dynamic phenomena, in preparation for more complex situations. ... the book is a research monograph based largely on the author's own work. ... The book does ... fill an important niche in the literature on singularly perturbed Markov chains. ... the book will be useful to applied probabilists and engineers who deal with such systems. Other than this, the book's primary audience is other researchers in singularly perturbed Markov chains." (IEEE Control Systems Magazine, December, 2005)