40,99 €
incl. VAT
Free shipping*
Ready to ship in 6-10 days
  • Paperback


Product description
Information theory is a branch of applied mathematics, electrical engineering and computer science concerned with the quantification of information. It is a broad and deep mathematical theory. Shannon introduced the quantitative and qualitative model of communication as a statistical process underlying information theory. Entropy optimization includes both maximization and minimization. Maximization of entropy is easy and can be done using Lagrange's method, since entropy is a concave function. Because of this concavity, minimization of entropy is not so simple. But calculation of the minimum-entropy probability distribution is necessary, because knowledge of both the maximum- and minimum-entropy probability distributions gives complete information. In the present book, Shannon entropy is minimized for any two given moments as constraints. As a particular case, the minimum Shannon entropy for the two moments Harmonic Mean and Harmonic Mean has been calculated for any value of n, and also for a six-faced die.
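To illustrate the maximization side the description mentions, the following is a minimal sketch (not taken from the book) of finding the maximum-entropy distribution on a six-faced die subject to a fixed mean, using the Gibbs form p_i ∝ exp(-λ·i) that Lagrange's method yields; the function names and the bisection search for λ are illustrative assumptions, not the author's code.

```python
import math

def max_entropy_dice(target_mean, n=6, tol=1e-12):
    """Maximum-entropy distribution on faces 1..n with a fixed mean.

    Lagrange's method gives the exponential (Gibbs) form
    p_i proportional to exp(-lam * i); we solve for the
    multiplier lam by bisection on the resulting mean.
    """
    faces = range(1, n + 1)

    def mean(lam):
        # Mean of the Gibbs distribution for a given multiplier.
        w = [math.exp(-lam * i) for i in faces]
        z = sum(w)
        return sum(i * wi for i, wi in zip(faces, w)) / z

    lo, hi = -50.0, 50.0  # bracket for lam; mean(lam) is decreasing in lam
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * i) for i in faces]
    z = sum(w)
    return [wi / z for wi in w]

def entropy(p):
    """Shannon entropy in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)
```

For a target mean of 3.5 the result is the uniform distribution with entropy log 6, the unconstrained maximum; skewed target means yield tilted exponential distributions with lower entropy.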
About the author
Dr. Shalu Garg obtained her Ph.D. degree in mathematics in 2013. She has worked on various topics in the field of information theory and is the author of five research papers published in international journals. She is currently working with the MITRC group, Alwar (Raj.).