An objective, dynamic and physically justified concept of information is elaborated starting from Shannon's concept of entropy and applied to information technology, artificial intelligence (consciousness) and thermodynamics. The justification of an information conservation theorem acquires practical significance in information technology, especially as it moves into the quantum realm (photonics/quantum computing). The unconditional dynamics of information and its objectivity are critically examined; they form the foundation of the considerations.
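For readers unfamiliar with the starting point named above, Shannon's entropy of a discrete probability distribution can be sketched in a few lines (the function name here is illustrative, not taken from the book):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), measured in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly 1 bit of information.
print(shannon_entropy([0.5, 0.5]))    # 1.0
# Four equally likely outcomes carry 2 bits.
print(shannon_entropy([0.25] * 4))    # 2.0
```

Higher entropy means greater uncertainty before observation, and hence more information gained by observing the outcome; this quantity is the base on which the book's objective, physical concept of information is built.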
We live in the information age, but the concept of information is still not defined objectively and physically. This book defines a dynamic concept of information that results in a conservation of information principle. Just as the principle of conservation of energy is essential to understanding energy, the principle of conservation of information leads to a deeper understanding of information.
Information is closely related to entropy; it is always in motion, cannot disappear, and exists independently of any observing subject.
"The title, Information is energy, intentionally challenges the longstanding view in information science that information is entropy. ... as a collection of references on the relation of computation and physics, and will certainly contribute to further, more systematic work in this field." (H. Van Dyke Parunak, Computing Reviews, April 8, 2024)