As distinct needs have arisen in physics, electrical engineering, and computational science, each field has created its own theory of information to serve as a conceptual instrument for advancing its development. This book provides a coherent consolidation of these information theories. The author surveys current theories and then introduces the underlying notion of symmetry, showing how information is related to the capacity of a system to distinguish itself. A formal methodology using group theory leads to the application of Burnside's Lemma to count distinguishable states, providing a versatile tool for quantifying complexity and information capacity in any physical system. Written in an informal style, the book is accessible to researchers in physics, chemistry, biology, and computational science, as well as many other fields.
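To make the counting idea concrete: Burnside's Lemma says the number of distinguishable states equals the average, over all symmetry operations, of the number of configurations each operation leaves fixed. The sketch below (a minimal illustration not taken from the book; the necklace example and function names are our own) applies this to colorings of a circular arrangement of beads under rotation, and checks the result by brute-force orbit counting:

```python
from math import gcd
from itertools import product

def burnside_necklaces(n, c):
    # Burnside's Lemma: distinguishable states = average number of
    # configurations fixed by each group element. For the cyclic group
    # of n rotations, rotation by k fixes c**gcd(n, k) colorings.
    return sum(c ** gcd(n, k) for k in range(n)) // n

def brute_force(n, c):
    # Direct orbit count: two colorings are indistinguishable when one
    # is a rotation of the other, so count one representative per orbit.
    seen = set()
    orbits = 0
    for coloring in product(range(c), repeat=n):
        if coloring in seen:
            continue
        orbits += 1
        for k in range(n):
            seen.add(coloring[k:] + coloring[:k])  # mark all rotations
    return orbits

print(burnside_necklaces(4, 2))  # 6 distinguishable 2-colorings of 4 beads
print(brute_force(4, 2))         # 6, agreeing with Burnside's count
```

The lemma replaces an exponential enumeration of configurations with a sum over the (typically small) symmetry group, which is what makes it practical as a measure of a system's capacity to distinguish states.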
From the reviews: "The author is concerned with the meaning of the term 'information'. He discusses theories of information that arise in thermodynamics and statistical mechanics, communication theory, and in complexity theory. ... The book is more a contribution to epistemology ... ." (L. L. Campbell, Mathematical Reviews, Issue 2008k)