Raymond W. Yeung
A First Course in Information Theory
- Hardcover
A First Course in Information Theory is an up-to-date introduction to information theory. In addition to the classical topics discussed, it provides the first comprehensive treatment of the theory of I-Measure, network coding theory, Shannon and non-Shannon type information inequalities, and a relation between entropy and group theory. ITIP, a software package for proving information inequalities, is also included. With a large number of examples, illustrations, and original problems, this book is excellent as a textbook or reference book for a senior or graduate level course on the subject, as well as a reference for researchers in related fields.
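For orientation, the Shannon-type inequalities mentioned above are the information inequalities implied by the basic inequalities, that is, the non-negativity of all of Shannon's information measures; a representative basic inequality, given here purely as an illustration, is
\[
I(X;Y \mid Z) = H(X \mid Z) - H(X \mid Y, Z) \ge 0,
\]
while the non-Shannon-type inequalities treated in the later chapters are valid information inequalities that cannot be deduced from the basic ones.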
Other customers were also interested in
- Raymond W. Yeung: A First Course in Information Theory, 150,99 €
- Raymond W. Yeung: Information Theory and Network Coding, 48,99 €
- James M. Ooi: Coding for Channels with Feedback, 74,99 €
- James M. Ooi: Coding for Channels with Feedback, 128,99 €
- John B. Anderson: Coded Modulation Systems, 194,99 €
- Mario Blaum / Patrick G. Farrell / Henk C.A. van Tilborg (eds.): Information, Coding and Mathematics, 125,99 €
- Ding-Zhu Du: Switching Networks: Recent Advances, 128,99 €
Note: This item can only be delivered to a German shipping address.
Product details
- Series: Information Technology: Transmission, Processing, and Storage
- Publisher: Springer Netherlands
- Publisher's item no.: 11124924
- 1st ed. 2002. Corr. 3rd printing. 2006
- Number of pages: 412
- Publication date: 30 April 2002
- Language: English
- Dimensions: 241mm x 167mm x 30mm
- Weight: 905g
- ISBN-13: 9780306467912
- ISBN-10: 0306467917
- Item no.: 20853573
- Manufacturer information:
- Libri GmbH
- Europaallee 1
- 36244 Bad Hersfeld
- 06621 890
Raymond W. Yeung, The Chinese University of Hong Kong, China
1. The Science of Information.
2. Information Measures.
2.1 Independence and Markov Chains.
2.2 Shannon's Information Measures.
2.3 Continuity of Shannon's Information Measures.
2.4 Chain Rules.
2.5 Informational Divergence.
2.6 The Basic Inequalities.
2.7 Some Useful Information Inequalities.
2.8 Fano's Inequality.
2.9 Entropy Rate of Stationary Source.
Problems.
Historical Notes.
3. Zero-Error Data Compression.
3.1 The Entropy Bound.
3.2 Prefix Codes.
3.3 Redundancy of Prefix Codes.
Problems.
Historical Notes.
4. Weak Typicality.
4.1 The Weak AEP.
4.2 The Source Coding Theorem.
4.3 Efficient Source Coding.
4.4 The Shannon-McMillan-Breiman Theorem.
Problems.
Historical Notes.
5. Strong Typicality.
5.1 Strong AEP.
5.2 Strong Typicality Versus Weak Typicality.
5.3 Joint Typicality.
5.4 An Interpretation of the Basic Inequalities.
Problems.
Historical Notes.
6. The I-Measure.
6.1 Preliminaries.
6.2 The I-Measure for Two Random Variables.
6.3 Construction of the I-Measure μ*.
6.4 μ* Can be Negative.
6.5 Information Diagrams.
6.6 Examples of Applications.
Appendix 6.A: A Variation of the Inclusion-Exclusion Formula.
Problems.
Historical Notes.
7. Markov Structures.
7.1 Conditional Mutual Independence.
7.2 Full Conditional Mutual Independence.
7.3 Markov Random Field.
7.4 Markov Chain.
Problems.
Historical Notes.
8. Channel Capacity.
8.1 Discrete Memoryless Channels.
8.2 The Channel Coding Theorem.
8.3 The Converse.
8.4 Achievability of the Channel Capacity.
8.5 A Discussion.
8.6 Feedback Capacity.
8.7 Separation of Source and Channel Coding.
Problems.
Historical Notes.
9. Rate-Distortion Theory.
9.1 Single-Letter Distortion Measures.
9.2 The Rate-Distortion Function R(D).
9.3 The Rate-Distortion Theorem.
9.4 The Converse.
9.5 Achievability of RI(D).
Problems.
Historical Notes.
10. The Blahut-Arimoto Algorithms.
10.1 Alternating Optimization.
10.2 The Algorithms.
10.3 Convergence.
Problems.
Historical Notes.
11. Single-Source Network Coding.
11.1 A Point-to-Point Network.
11.2 What is Network Coding?
11.3 A Network Code.
11.4 The Max-Flow Bound.
11.5 Achievability of the Max-Flow Bound.
Problems.
Historical Notes.
12. Information Inequalities.
12.1 The Region Γ*n.
12.2 Information Expressions in Canonical Form.
12.3 A Geometrical Framework.
12.4 Equivalence of Constrained Inequalities.
12.5 The Implication Problem of Conditional Independence.
Problems.
Historical Notes.
13. Shannon-Type Inequalities.
13.1 The Elemental Inequalities.
13.2 A Linear Programming Approach.
13.3 A Duality.
13.4 Machine Proving.
13.5 Tackling the Implication Problem.
13.6 Minimality of the Elemental Inequalities.
Appendix 13.A: The Basic Inequalities and the Polymatroidal Axioms.
Problems.
Historical Notes.
14. Beyond Shannon-Type Inequalities.
14.1 Characterizations of Γ*2, Γ*3, and Γ*n.
14.2 A Non-Shannon-Type Unconstrained Inequality.
14.3 A Non-Shannon-Type Constrained Inequality.
14.4 Applications.
Problems.
Historical Notes.
15. Multi-Source Network Coding.
15.1 Two Characteristics.
15.2 Examples of Application.
15.3 A Network Code for Acyclic Networks.
15.4 An Inner Bound.
15.5 An Outer Bound.
15.6 The LP Bound and Its Tightness.
15.7 Achievability of Rin.
Appendix 15.A: Approximation of Random Variables with Infinite Alphabets.
Problems.
Historical Notes.
16. Entropy and Groups.
16.1 Group Preliminaries.
16.2 Group-Characterizable Entropy Functions.
16.3 A Group Characterization of Γ*n.
16.4 Information Inequalities and Group Inequalities.
Problems.
Historical Notes.