INFORMATION AND COMPLEXITY (HOW TO MEASURE THEM?)
Computer science, also called informatics, is often defined as the theory of storing, processing, and communicating information. The key notion in the theory of computing is that of complexity. The basic tasks of computer science (and their variations) lead to various measures of complexity. We may speak of the complexity of a structure, meaning the amount of information (number of bits) in the most economical "blueprint" of the structure; this is the minimum space we need to store enough information about the structure to allow its reconstruction. We may also speak of the algorithmic complexity of a certain task: this is the minimum time (or other computational resource) needed to carry out the task on a computer. And we may speak of the communication complexity of tasks involving more than one processor: this is the number of bits that have to be transmitted in solving the task (I will not discuss this last notion in these notes).

It is important to emphasize that the notions of the theory of computing (algorithms, encodings, machine models, complexity) can be defined and measured in a mathematically precise way. The resulting theory is as exact as Euclidean geometry. The elaboration of the mathematical theory would, of course, be beyond these notes; but I hope that I can sketch the motivation for introducing these complexity measures and indicate their possible interest in various areas. Complexity, I believe, should play a central role in the study of a large variety of phenomena, from computers to genetics to brain research to statistical mechanics. In fact, these mathematical ideas and tools may prove as important in the life sciences as the tools of classical mathematics (calculus and algebra) have proved in physics and chemistry. Like most phenomena of the world, complexity appears first as an obstacle in the way of knowledge (or as a convenient excuse for ignorance).
As a next phase, we begin to understand it, measure it, and determine its laws and its connections to our previous knowledge. Finally, we make use of it in engineering: complexity has reached this level, and it is widely used in cryptography, random-number generation, data security, and other areas. Some of these aspects are discussed by Adi Shamir in this volume.
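The idea of a structure's complexity as the length of its most economical blueprint can be illustrated, very roughly, with a general-purpose compressor: the compressed length of a string is an upper bound on the number of bits needed to describe it. This is only a crude proxy for the exact (and uncomputable) minimum, not a method from these notes; the sketch below uses Python's standard library and illustrative data of my own choosing.

```python
import os
import zlib

def description_complexity(data: bytes) -> int:
    """Crude upper bound on a string's descriptive complexity:
    the length, in bits, of its zlib-compressed form."""
    return 8 * len(zlib.compress(data, 9))

# A highly regular structure has a short blueprint ("repeat 'ab' 500 times"),
# so its compressed description is far shorter than the structure itself.
regular = b"ab" * 500

# Random-looking data admits no blueprint much shorter than the data,
# so compression cannot shrink it appreciably.
random_like = os.urandom(1000)
```

Comparing `description_complexity(regular)` with `description_complexity(random_like)` shows the gap the text describes: both inputs are 1000 bytes, yet the regular one needs only a few hundred bits to describe, while the random one needs roughly its full length.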