This book introduces readers to information theoretic techniques for statistical data science and for characterizing the behavior and performance of learning agents, going beyond the standard results on the fundamental limits of communication and compression. Readers will benefit from a presentation of information theoretic quantities, definitions, and results that provide, or could provide, insights into data science and learning.
Product Details
Synthesis Lectures on Engineering, Science, and Technology
Jerry D. Gibson is Professor of Electrical and Computer Engineering at the University of California, Santa Barbara. He has been an Associate Editor of the IEEE Transactions on Communications and the IEEE Transactions on Information Theory, and was an IEEE Communications Society Distinguished Lecturer for 2007-2008. He is an IEEE Fellow and has received the Frederick Emmons Terman Award (1990), the 1993 IEEE Signal Processing Society Senior Paper Award, the 2009 IEEE Technical Committee on Wireless Communications Recognition Award, and the 2010 Best Paper Award from the IEEE Transactions on Multimedia. He is the author, coauthor, or editor of several books, the most recent of which are The Mobile Communications Handbook (Editor, 3rd ed., 2012), Rate Distortion Bounds for Voice and Video (Coauthor with Jing Hu, NOW Publishers, 2014), and Information Theory and Rate Distortion Theory for Communications and Compression (Morgan & Claypool, 2014). His research interests are lossy source coding, wireless communications and networks, and digital signal processing.
Table of Contents
Background and Overview.- Entropy and Mutual Information.- Differential Entropy, Entropy Rate, and Maximum Entropy.- Typical Sequences and The AEP.- Markov Chains and Cascaded Systems.- Hypothesis Testing, Estimation, Information, and Sufficient Statistics.- Information Theoretic Quantities and Learning.- Estimation and Entropy Power.- Time Series Analyses.- Information Bottleneck Principle.- Channel Capacity.- Rate Distortion Theory.