Characterizing Multivariate Information Flows

One of the crucial steps in scientific studies is to specify dependent relationships among factors in a system of interest. Given little knowledge of a system, can we characterize the underlying dependent relationships through observation of its temporal behaviors? In multivariate systems, there are potentially many possible dependent structures confusable with each other, and this may cause false detection of illusory dependency between unrelated factors. The present study proposes a new information-theoretic measure that takes such potential multivariate relationships into account. The proposed measure, called multivariate transfer entropy, is an extension of transfer entropy, a measure of temporal predictability. In simulations and empirical studies, we demonstrated that the proposed measure characterized the latent dependent relationships in unknown dynamical systems more accurately than its alternative measure.
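As background for the abstract, the standard (bivariate) transfer entropy it extends measures how much knowing the past of one series X improves prediction of the next value of another series Y beyond Y's own past. A minimal plug-in sketch for discrete series with history length 1 (function name, the estimator, and the toy driving example are all illustrative assumptions, not the paper's method):

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y):
    """Plug-in estimate of T_{X->Y} with history length 1:
    sum over (y1, y0, x0) of p(y1, y0, x0) * log2( p(y1|y0,x0) / p(y1|y0) ).
    Inputs are equal-length sequences of discrete symbols."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))        # (y_{t+1}, y_t, x_t)
    n = len(triples)
    c_full = Counter(triples)                         # counts of (y1, y0, x0)
    c_y0x0 = Counter((y0, x0) for _, y0, x0 in triples)
    c_y1y0 = Counter((y1, y0) for y1, y0, _ in triples)
    c_y0 = Counter(y0 for _, y0, _ in triples)
    te = 0.0
    for (y1, y0, x0), c in c_full.items():
        p_joint = c / n                               # p(y1, y0, x0)
        cond_full = c / c_y0x0[(y0, x0)]              # p(y1 | y0, x0)
        cond_self = c_y1y0[(y1, y0)] / c_y0[y0]       # p(y1 | y0)
        te += p_joint * log2(cond_full / cond_self)
    return te

# Toy example: X drives Y with a one-step lag, so information flows X -> Y.
random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]                                      # y_{t+1} = x_t
print(transfer_entropy(x, y))                         # near 1 bit
print(transfer_entropy(y, x))                         # near 0
```

The asymmetry of the two printed values is what makes transfer entropy directional; the paper's contribution is extending this idea to condition on multiple other variables at once, so that a common driver is not mistaken for a direct dependency.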
