Exploiting tensor rank-one decomposition in probabilistic inference

We propose a new additive decomposition of probability tables – tensor rank-one decomposition. The basic idea is to decompose a probability table into a series of tables whose sum equals the original table. Each table in the series has the same domain as the original table but can be expressed as a product of one-dimensional tables. Entries in the tables may be any real number, i.e. negative entries are allowed. This possibility of negative numbers, in contrast to a multiplicative decomposition, opens new possibilities for a compact representation of probability tables. We show that tensor rank-one decomposition can be used to reduce the space and time requirements of probabilistic inference. We provide a closed-form solution for the minimal tensor rank-one decomposition of some special tables and propose a numerical algorithm for cases where no closed-form solution is known.
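To illustrate the idea, here is a minimal NumPy sketch (a hypothetical example, not the paper's code): the conditional probability table of a deterministic XOR can be written as a sum of just two rank-one terms once negative entries are allowed, whereas storing it directly requires the full table.

```python
import numpy as np

# Hypothetical example: the CPT of a deterministic XOR,
# T[x1, x2, y] = P(y | x1, x2) with y = x1 XOR x2.
T = np.zeros((2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        T[x1, x2, x1 ^ x2] = 1.0

# Additive rank-one decomposition with two terms, using negative entries:
#   T = 1/2 * e (x) e (x) e  +  1/2 * v (x) v (x) v,
# where e = (1, 1) and v = (1, -1). Each term is an outer product of
# one-dimensional tables; the second term has negative entries.
e = np.array([1.0, 1.0])
v = np.array([1.0, -1.0])
T_hat = 0.5 * np.einsum('i,j,k->ijk', e, e, e) \
      + 0.5 * np.einsum('i,j,k->ijk', v, v, v)

assert np.allclose(T, T_hat)  # the two-term series sums to the original table
```

During inference, the one-dimensional factors can be multiplied into other potentials separately and the results summed, which is what reduces the space and time requirements.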
