MAXIMAL NON-SYMMETRIC ENTROPY LEADS NATURALLY TO ZIPF'S LAW

As one of the most fundamental empirical laws, Zipf's law has been studied from many perspectives, yet its origin remains an open problem, and several models have been constructed to explain it. In this Letter, we introduce the concept of the auxiliary information of an event and define a new entropy, called non-symmetric entropy. We prove that maximizing non-symmetric entropy leads naturally to Zipf's law for a particular choice of the auxiliary information parameters; other choices of these parameters yield other distribution laws.
