This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as the Hamming codes, the simplex codes, and many others.

Table of Contents:
Introduction: Preliminaries; Miscellany; Some Probability; Matrices.
1. An Introduction to Codes: Strings and Things; What Are Codes?; Uniquely Decipherable Codes; Instantaneous Codes and Kraft's Theorem.
2. Efficient Encoding: Information Sources; Average Codeword Length; Huffman Encoding; The Proof that Huffman Encoding Is the Most Efficient.
3. Noiseless Coding: Entropy; Properties of Entropy; Extensions of an Information Source; The Noiseless Coding Theorem.
II. Coding Theory
4. The Main Coding Theory Problem: Communications Channels; Decision Rules; Nearest Neighbor Decoding;