Information Theory and Language

Human language is a system of communication [...].
