Presents a useful new technique for analyzing the extreme-value behaviour of random fields.

Modern science typically involves the analysis of increasingly complex data. The extreme values that emerge in the statistical analysis of such data are often of particular interest. This book focuses on analytical approximations of the statistical significance of extreme values. Several relatively complex applications of the technique to problems that arise in practical situations are presented. All of the examples are difficult to analyze using classical methods, and so the author presents a novel technique designed to be more accessible to the user. Extreme value analysis is widely applied in areas such as operational research, bioinformatics, computer science, finance and many other disciplines. This book will be useful for scientists, engineers and advanced graduate students who need to develop their own statistical tools for the analysis of their data. Whilst the book may not provide the reader with a specific answer, it will inspire them to rethink their problem in the context of random fields, apply the method, and produce a solution.
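To give a concrete sense of the kind of problem the book addresses, the sketch below simulates a toy kernel-smoothed Gaussian noise sequence (a simple stand-in for a scanning statistic) and estimates the significance of its maximum by Monte Carlo, next to the crude Bonferroni (union) bound. This is only an illustration of the problem, not the book's own analytic technique, and the grid size, kernel bandwidth and threshold are arbitrary values chosen for the example.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy scanning statistic: kernel-smoothed white noise on a 1-D grid (illustrative parameters).
n, bandwidth, threshold = 500, 5.0, 3.0
t = np.arange(n)
weights = np.exp(-0.5 * ((t[:, None] - t[None, :]) / bandwidth) ** 2)
weights /= np.linalg.norm(weights, axis=1, keepdims=True)  # scale rows so each Z_t has unit variance

n_sim = 2000
exceedances = 0
for _ in range(n_sim):
    z = weights @ rng.standard_normal(n)  # correlated Gaussian field with Z_t ~ N(0, 1)
    exceedances += z.max() > threshold

print(f"Monte Carlo estimate of P(max_t Z_t > {threshold}): {exceedances / n_sim:.4f}")
print(f"Bonferroni (union) bound, ignoring dependence:      {min(1.0, n * norm.sf(threshold)):.4f}")
```

The gap between the two numbers gives a feeling for why refined analytic approximations of the type developed in the book are useful: the union bound ignores the strong local dependence of the field and therefore grossly overstates the significance of the observed maximum.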
Benjamin Yakir, Department of Statistics, The Hebrew University of Jerusalem, Israel
Table of Contents
Preface xi
Acknowledgments xv

Part I THEORY 1
1 Introduction 3
  1.1 Distribution of extremes in random fields 3
  1.2 Outline of the method 7
  1.3 Gaussian and asymptotically Gaussian random fields 9
  1.4 Applications 11
2 Basic examples 15
  2.1 Introduction 15
  2.2 A power-one sequential test 15
  2.3 A kernel-based scanning statistic 24
  2.4 Other methods 38
3 Approximation of the local rate 41
  3.1 Introduction 41
  3.2 Preliminary localization and approximation 43
  3.3 Measure transformation 51
  3.4 Application of the localization theorem 55
  3.5 Integration
4 From the local to the global 71
  4.1 Introduction 71
  4.2 Poisson approximation of probabilities 72
  4.3 Average run length to false alarm 78
5 The localization theorem 87
  5.1 Introduction 87
  5.2 A simplified version of the localization theorem 88
  5.3 The localization theorem 90
  5.4 A local limit theorem 95
  5.5 Edge effects and higher order approximations 100

Part II APPLICATIONS 103
6 Nonparametric tests: Kolmogorov-Smirnov and Peacock 105
  6.1 Introduction 105
  6.2 Analysis of the one-dimensional case 109
  6.3 Peacock's test 120
  6.4 Relations to scanning statistics 123
7 Copy number variations 125
  7.1 Introduction 125
  7.2 The statistical model 127
  7.3 Analysis of statistical properties 131
  7.4 The false discovery rate 140
8 Sequential monitoring of an image 143
  8.1 Introduction 143
  8.2 The statistical model 146
  8.3 Analysis of statistical properties 148
  8.4 Optimal change-point detection 161
9 Buffer overflow 165
  9.1 Introduction 165
  9.2 The statistical model 169
  9.3 Analysis of statistical properties 172
  9.4 Heavy tail distribution, long-range dependence, and self-similarity 186
10 Computing Pickands' constants 191
  10.1 Introduction 191
  10.2 Representations of constants 196
  10.3 Analysis of statistical error 199
  10.4 Enumerating the effect of local fluctuations 204

Appendix: Mathematical background 209
  A.1 Transforms 209
  A.2 Approximations of sum of independent random elements 211
  A.3 Concentration inequalities 214
  A.4 Random walks 215
  A.5 Renewal theory 215
  A.6 The Gaussian distribution 216
  A.7 Large sample inference 217
  A.8 Integration 218
  A.9 Poisson approximation 219
  A.10 Convexity 220

References 221
Index 223