Markov chains are a fundamental class of stochastic processes. They are widely used to solve problems in a large number of domains such as operational research, computer science, communication networks and manufacturing systems. The success of Markov chains is mainly due to their simplicity of use, the large number of available theoretical results and the quality of algorithms developed for the numerical evaluation of many metrics of interest. The author presents the theory of both discrete-time and continuous-time homogeneous Markov chains. He carefully examines the explosion phenomenon, the Kolmogorov equations, the convergence to equilibrium and the passage time distributions to a state and to a subset of states. These results are applied to birth-and-death processes. He then proposes a detailed study of the uniformization technique by means of Banach algebra. This technique is used for the transient analysis of several queuing systems.
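As a rough illustration of the uniformization technique mentioned above (not taken from the book; the function name and the 3-state generator below are hypothetical), here is a minimal Python sketch for a finite state space: the generator Q is uniformized into the stochastic matrix P = I + Q/Λ with Λ ≥ max_i |q_ii|, and the transient distribution π(t) is obtained as a Poisson-weighted sum of the powers of P, truncated once the remaining Poisson mass falls below a tolerance. The book treats the infinite-state case rigorously via Banach algebra; this sketch only covers the finite case, where the powers of P can be multiplied directly.

```python
import numpy as np

def transient_distribution(Q, pi0, t, tol=1e-10):
    """Transient distribution pi(t) of a finite CTMC via uniformization (illustrative sketch).

    Q    : generator matrix (rows sum to 0, off-diagonal entries >= 0)
    pi0  : initial probability distribution (row vector)
    t    : time horizon
    tol  : truncation tolerance on the accumulated Poisson weights
    """
    Lam = max(-np.diag(Q))             # uniformization rate Lambda >= max_i |q_ii|
    P = np.eye(Q.shape[0]) + Q / Lam   # uniformized DTMC transition matrix, stochastic by construction
    weight = np.exp(-Lam * t)          # Poisson weight e^{-Lambda t} (Lambda t)^n / n!, starting at n = 0
    v = pi0.copy()                     # running value of pi0 * P^n
    result = weight * v
    acc = weight                       # accumulated Poisson mass
    n = 0
    while 1.0 - acc > tol:             # stop when the Poisson tail is negligible
        n += 1
        v = v @ P
        weight *= Lam * t / n
        result += weight * v
        acc += weight
    return result

# Hypothetical 3-state generator, just to exercise the sketch
Q = np.array([[-1.0,  1.0,  0.0],
              [ 2.0, -3.0,  1.0],
              [ 0.0,  2.0, -2.0]])
pi0 = np.array([1.0, 0.0, 0.0])
print(transient_distribution(Q, pi0, t=0.5))
```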
Bruno Sericola is a senior Research Scientist at Inria Rennes - Bretagne Atlantique in France. His main research activity is in performance evaluation of computer and communication systems, dependability analysis of fault-tolerant systems and stochastic models.
Table of Contents
Preface ix

Chapter 1. Discrete-Time Markov Chains 1
1.1. Definitions and properties 1
1.2. Strong Markov property 5
1.3. Recurrent and transient states 8
1.4. State classification 12
1.5. Visits to a state 14
1.6. State space decomposition 18
1.7. Irreducible and recurrent Markov chains 22
1.8. Aperiodic Markov chains 30
1.9. Convergence to equilibrium 34
1.10. Ergodic theorem 41
1.11. First passage times and number of visits 53
1.12. Finite Markov chains 68
1.13. Absorbing Markov chains 70
1.14. Examples 76
1.15. Bibliographical notes 87

Chapter 2. Continuous-Time Markov Chains 89
2.1. Definitions and properties 92
2.2. Transition functions and infinitesimal generator 93
2.3. Kolmogorov's backward equation 108
2.4. Kolmogorov's forward equation 114
2.5. Existence and uniqueness of the solutions 127
2.6. Recurrent and transient states 130
2.7. State classification 137
2.8. Explosion 141
2.9. Irreducible and recurrent Markov chains 148
2.10. Convergence to equilibrium 162
2.11. Ergodic theorem 166
2.12. First passage times 172
2.13. Absorbing Markov chains 184
2.14. Bibliographical notes 190

Chapter 3. Birth-and-Death Processes 191
3.1. Discrete-time birth-and-death processes 191
3.2. Absorbing discrete-time birth-and-death processes 200
3.3. Periodic discrete-time birth-and-death processes 208
3.4. Continuous-time pure birth processes 209
3.5. Continuous-time birth-and-death processes 213
3.6. Absorbing continuous-time birth-and-death processes 228
3.7. Bibliographical notes 233

Chapter 4. Uniformization 235
4.1. Introduction 235
4.2. Banach spaces and algebra 237
4.3. Infinite matrices and vectors 243
4.4. Poisson process 249
4.5. Uniformizable Markov chains 263
4.6. First passage time to a subset of states 273
4.7. Finite Markov chains 275
4.8. Transient regime 276
4.9. Bibliographical notes 286

Chapter 5. Queues 287
5.1. The M/M/1 queue 288
5.2. The M/M/c queue 315
5.3. The M/M/∞ queue 318
5.4. Phase-type distributions 323
5.5. Markovian arrival processes 326
5.6. Batch Markovian arrival process 342
5.7. Block-structured Markov chains 352
5.8. Applications 370
5.9. Bibliographical notes 380

Appendix 1. Basic Results 381
Bibliography 387
Index 395