TOWARDS A NEW INFORMATION THEORY
Modern information theory was established by Claude Shannon around the middle of the past century. The key challenge at the time was how to ensure reliable transmission of signals. A prototype example suffices to illustrate the point: a signal represented by a binary sequence is to be transmitted over a noisy channel, which may randomly flip each bit with a given error rate. At the receiving end we would like to recover the original message correctly. What can be done? Since there are no miracles, the only way to surmount the difficulty is to send the original bits more than once, in the hope that the receiver can figure out the correct original bits. If resources were infinite, the solution would be simple: repeat each bit an infinite number of times, and a simple average on the receiver's side would suffice to recover the exact original message. Infinite resources, however, never exist in reality. The work of Shannon and his disciples was to find the minimal amount of redundancy necessary to recover, or decode, the correct signal, and, if the resources fall below this minimal requirement, the best approximation one can obtain. Shannon's information theory thus provides a general theoretical framework for constructing the most efficient information filter for noisy signals.

With the current rapid advances in information technology, especially with the advent of the Internet, there is far more information available than people can reliably sift for what is relevant and important to them. To cope with this 'information explosion', search engines play a pivotal role. Currently the most popular search engine is Google, which has demonstrated in a brief time span how much difference a more powerful information filtering mechanism can make. In this essay I outline a theoretical framework akin to Shannon's information theory, which tackles this basic mechanism.

∗Department of Physics, University of Fribourg, CH-1700 Fribourg, Switzerland. yi-cheng.zhang@unifr.ch
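The repetition-code scenario described above can be sketched in a few lines of code. The following is an illustrative simulation, not part of the original essay: each bit is repeated n times, passed through a binary symmetric channel that flips each transmitted bit with probability p, and recovered by majority vote. All function names and parameter values are assumptions chosen for the example.

```python
import random

def transmit(bits, p, rng):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def encode(bits, n):
    """Repetition code: repeat each bit n times."""
    return [b for b in bits for _ in range(n)]

def decode(received, n):
    """Majority vote over each block of n repetitions."""
    return [int(sum(received[i:i + n]) > n / 2)
            for i in range(0, len(received), n)]

rng = random.Random(0)
message = [rng.randint(0, 1) for _ in range(1000)]
p = 0.1  # channel flip probability

# Without redundancy the error rate is roughly p.
raw_errors = sum(a != b for a, b in zip(message, transmit(message, p, rng)))

# With a 5-fold repetition code, majority voting suppresses most errors.
coded = transmit(encode(message, 5), p, rng)
rep_errors = sum(a != b for a, b in zip(message, decode(coded, 5)))

print(raw_errors, rep_errors)
```

The repetition code trades bandwidth for reliability: with n = 5 a decoded bit is wrong only when at least three of its five copies flip, so the residual error rate falls from about p to about 10·p³ — the kind of redundancy/fidelity trade-off that Shannon's theory makes precise.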