This book is based on lectures given by the author at the IBM European Systems Research Institute (ESRI) in Geneva. Information Theory on the syntactic level, as introduced by Claude Shannon in 1949, has many limitations when applied to information processing by computers. But in spite of some obvious shortcomings, the underlying principles are of fundamental importance for systems engineers in understanding the nature of the problems of handling information: its acquisition, storage, processing, and interpretation. The lectures, as presented in this book, attempt to give an exposition of the logical foundation and basic principles, and at the same time to provide a basis for further study in more specific areas of this expanding theory, such as coding, detection, pattern recognition, and filtering.

Most of the problems in Appendix C are intended as extensions of the text, calling for active participation by the student. Some other problems are direct applications of the theory to specific situations. Some problems require extensive numerical calculations; in those cases it is assumed that the student has access to a computer and is capable of writing the necessary programs.

The student is assumed to have a good command of the calculus and of the theory of probability, as well as statistics. Therefore no basic mathematical concepts are discussed in this book. The Fourier transform and some related mathematical concepts are introduced in Appendix A.