106,99 €
incl. VAT
Immediately available via download
  • Format: PDF


Product Description
My thanks are due to the many people who have assisted in the work reported here and in the preparation of this book. The work is incomplete and this account of it rougher than it might be. Such virtues as it has owe much to others; the faults are all mine.

My work leading to this book began when David Boulton and I attempted to develop a method for intrinsic classification. Given data on a sample from some population, we aimed to discover whether the population should be considered to be a mixture of different types, classes or species of thing, and, if so, how many classes were present, what each class looked like, and which things in the sample belonged to which class. I saw the problem as one of Bayesian inference, but with prior probability densities replaced by discrete probabilities reflecting the precision to which the data would allow parameters to be estimated. Boulton, however, proposed that a classification of the sample was a way of briefly encoding the data: once each class was described and each thing assigned to a class, the data for a thing would be partially implied by the characteristics of its class, and hence require little further description. After some weeks' arguing our cases, we decided on the maths for each approach, and soon discovered they gave essentially the same results. Without Boulton's insight, we may never have made the connection between inference and brief encoding, which is the heart of this work.
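The equivalence the preface describes can be stated in one line. In a two-part message, the hypothesis H is encoded first and the data D second, assuming H; by Shannon's coding theorem an event of probability p costs -log p units, so the total message length is (a standard sketch of the MML idea, not text quoted from this page):

    I(H, D) = -\log P(H) - \log P(D \mid H) = -\log\bigl(P(H)\,P(D \mid H)\bigr)

Minimizing I(H, D) over hypotheses is therefore the same as maximizing P(H) P(D | H), i.e. the (discretized) Bayesian posterior:

    \hat{H}_{\mathrm{MML}} = \arg\min_H I(H, D) = \arg\max_H P(H)\,P(D \mid H)

which is why Wallace's Bayesian formulation and Boulton's brief-encoding formulation "gave essentially the same results".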
About the Author
C.S. Wallace was appointed Foundation Chair of Computer Science at Monash University in 1968, at the age of 35, where he worked until his death in 2004. He received an ACM Fellowship in 1995, and was appointed Professor Emeritus in 1996. Professor Wallace made numerous significant contributions to diverse areas of Computer Science, such as Computer Architecture, Simulation and Machine Learning. His final research focused primarily on the Minimum Message Length Principle.

Reviews
From the reviews:

"The subject matter is highly technical, and the book is correspondingly detailed. The book is intended for graduate-level courses, and should be effective in that role if the instructor is sufficiently expert in the area. For researchers at the postdoctoral level, the book will provide a wealth of information about the field.... [T]he book is likely to remain the primary reference in the field for many years to come." (Donald Richards, JASA, June 2009, Vol. 104, No. 486)

"Any statistician interested in the foundations of the discipline, or the deeper philosophical issues of inference, will find this volume a rewarding read." (International Statistical Institute, December 2005)

"This very significant monograph covers the topic of the Minimum Message Length (MML) principle, a new approach to induction, hypothesis testing, model selection, and statistical inference. ... This valuable book covers the topics at a level suitable for professionals and graduate students in Statistics, Computer Science, Data Mining, Machine Learning, Estimation and Model-selection, Econometrics etc." (Jerzy Martyna, Zentralblatt MATH, Vol. 1085, 2006)

"This book is around a simple idea: 'The best explanation of the facts is the shortest'. ... The book applies the above idea to statistical estimation in a Bayesian context. ... I think it will be valuable for readers who have at the same time strong interest in Bayesian decision theory and in Shannon information theory." (Michael Kohler, Metrika, Vol. 64, 2006)