Towards a general theory of information II: information and entropy

Future developments of computer systems will be handicapped not by the limitations of hardware but by our lack of understanding of human reasoning processes. The development of three‐dimensional chips, cryogenic superconducting devices, or optical systems — and, in due course, biological computers — presages the emergence of generations of super information processors whose power will dwarf that of the present generation of devices, just as they, in turn, have dwarfed the capacity of the computers of the pre‐transistor age. The effective application of such powerful future computers will be limited by the lack of an adequate theoretical basis for the processing of information. Gordon Scarrott has championed the need for a ‘science of information’ which should investigate the ‘natural properties of information such as function, structure, dynamic behaviour and statistical features…’ Such an effort should ‘… lead to a conceptual framework to guide systems design.’