A theory is proposed that treats information as a basic property of the universe, on the same footing as matter and energy. Operationally, just as energy is defined in terms of its capacity to perform work, information is defined in terms of its capacity to organize a system. Pure energy can perform no 'useful' (entropy-reducing) work without a concomitant input of information. Conversely, every expenditure of energy leads to a reorganization of the universe, and hence to a change in its information status. Energy and information are interconvertible. Physicists have been able to ignore the information parameter for two main reasons. First, historically, just as there was no need to define energy before the advent of increasingly complex powered machinery and cannons (Galileo was a military engineer), there was no need until the 20th century to define information; it was the telephone engineers who first set about developing a theory of information. Second, physicists invented accounting devices such as potential energy and entropy to explain the apparent disappearance of energy while preserving the law of conservation of energy. The proposed theory holds that what is conserved is the sum of information and energy. The mathematical relationship between information and entropy is given by I = I₀ e^(−S/k), while the conversion of energy into information involves the relationship 1 J/K ≈ 10²³ bits. Acceptance of the theory would require paradigm shifts in a number of interrelated areas.
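As a rough numerical sketch (not part of the original abstract), the Python snippet below evaluates the two stated relationships: the exponential relation I = I₀ e^(−S/k) and the conversion of 1 J/K into bits, which follows from dividing by k·ln 2 if k is taken to be the Boltzmann constant. The function names and that identification of k are assumptions made for illustration.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value); assumed to be the k in I = I0*e^(-S/k)


def entropy_to_bits(delta_s_joule_per_kelvin: float) -> float:
    """Convert a thermodynamic entropy change (J/K) into bits,
    assuming one bit corresponds to an entropy of k_B * ln(2)."""
    return delta_s_joule_per_kelvin / (K_B * math.log(2))


def information_remaining(i0: float, s: float, k: float = K_B) -> float:
    """Evaluate the abstract's relation I = I0 * exp(-S/k):
    information falls off exponentially as entropy S grows (in units of k)."""
    return i0 * math.exp(-s / k)


if __name__ == "__main__":
    # 1 J/K comes out to about 1.04e23 bits, consistent with the ~10^23 figure quoted above.
    print(f"1 J/K ≈ {entropy_to_bits(1.0):.3e} bits")
```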