Yury Polyanskiy (Massachusetts Institute of Technology), Yihong Wu (Yale University, Connecticut)
Information Theory
From Coding to Learning
- Hardcover
This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning. Includes over 210 student exercises, emphasising practical applications in statistics, machine learning and modern communication theory. Accompanied by online instructor solutions.
Other customers were also interested in
- Khalid Sayood: Introduction to Data Compression, 132,99 €
- Joseph M. Powers (University of Notre Dame, Indiana): Mechanics of Fluids, 123,99 €
- Dragan Huterer (University of Michigan, Ann Arbor): A Course in Cosmology, 52,99 €
- Itzhak Gilboa (Tel-Aviv University): Theory of Decision Under Uncertainty, 40,99 €
- Itzhak Gilboa (Tel-Aviv University): Theory of Decision Under Uncertainty, 74,99 €
- Bertie J. Weddell (Washington State University): Conservation in the Context of a Changing World, 69,99 €
- Qamrul Hasan Ansari (Aligarh Muslim University, India): Fixed Point Theory and Variational Principles in Metric Spaces, 132,99 €
Note: This item can only be shipped to a German delivery address.
Product details
- Publisher: Cambridge University Press
- Number of pages: 748
- Publication date: 20 February 2025
- Language: English
- Dimensions: 249mm x 175mm x 41mm
- Weight: 1680g
- ISBN-13: 9781108832908
- ISBN-10: 1108832903
- Item no.: 70725904
- Manufacturer information
- Libri GmbH
- Europaallee 1
- 36244 Bad Hersfeld
- gpsr@libri.de
Yury Polyanskiy is a Professor of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology, with a focus on information theory, statistical machine learning, error-correcting codes, wireless communication, and fault tolerance. He is the recipient of the 2020 IEEE Information Theory Society James Massey Award for outstanding achievement in research and teaching in Information Theory.
Part I. Information measures: 1. Entropy
2. Divergence
3. Mutual information
4. Variational characterizations and continuity of information measures
5. Extremization of mutual information: capacity saddle point
6. Tensorization and information rates
7. f-divergences
8. Entropy method in combinatorics and geometry
9. Random number generators
Part II. Lossless Data Compression: 10. Variable-length compression
11. Fixed-length compression and Slepian-Wolf theorem
12. Entropy of ergodic processes
13. Universal compression
Part III. Hypothesis Testing and Large Deviations: 14. Neyman-Pearson lemma
15. Information projection and large deviations
16. Hypothesis testing: error exponents
Part IV. Channel Coding: 17. Error correcting codes
18. Random and maximal coding
19. Channel capacity
20. Channels with input constraints. Gaussian channels
21. Capacity per unit cost
22. Strong converse. Channel dispersion. Error exponents. Finite blocklength
23. Channel coding with feedback
Part V. Rate-distortion Theory and Metric Entropy: 24. Rate-distortion theory
25. Rate distortion: achievability bounds
26. Evaluating rate-distortion function. Lossy Source-Channel separation
27. Metric entropy
Part VI. Statistical Applications: 28. Basics of statistical decision theory
29. Classical large-sample asymptotics
30. Mutual information method
31. Lower bounds via reduction to hypothesis testing
32. Entropic bounds for statistical estimation
33. Strong data processing inequality.