This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning. Includes over 210 student exercises, emphasising practical applications in statistics, machine learning and modern communication theory. Accompanied by online instructor solutions.
Yury Polyanskiy is a Professor of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology, with a focus on information theory, statistical machine learning, error-correcting codes, wireless communication, and fault tolerance. He is the recipient of the 2020 IEEE Information Theory Society James Massey Award for outstanding achievement in research and teaching in Information Theory.
Table of Contents
Part I. Information Measures: 1. Entropy. 2. Divergence. 3. Mutual information. 4. Variational characterizations and continuity of information measures. 5. Extremization of mutual information: capacity saddle point. 6. Tensorization and information rates. 7. f-divergences. 8. Entropy method in combinatorics and geometry. 9. Random number generators.
Part II. Lossless Data Compression: 10. Variable-length compression. 11. Fixed-length compression and Slepian-Wolf theorem. 12. Entropy of ergodic processes. 13. Universal compression.
Part III. Hypothesis Testing and Large Deviations: 14. Neyman-Pearson lemma. 15. Information projection and large deviations. 16. Hypothesis testing: error exponents.
Part IV. Channel Coding: 17. Error correcting codes. 18. Random and maximal coding. 19. Channel capacity. 20. Channels with input constraints; Gaussian channels. 21. Capacity per unit cost. 22. Strong converse; channel dispersion; error exponents; finite blocklength. 23. Channel coding with feedback.
Part V. Rate-Distortion Theory and Metric Entropy: 24. Rate-distortion theory. 25. Rate distortion: achievability bounds. 26. Evaluating the rate-distortion function; lossy source-channel separation. 27. Metric entropy.
Part VI: 28. Basics of statistical decision theory. 29. Classical large-sample asymptotics. 30. Mutual information method. 31. Lower bounds via reduction to hypothesis testing. 32. Entropic bounds for statistical estimation. 33. Strong data processing inequality.