Estimation of a Low-Rank Probability-Tensor from Sample Sub-Tensors via Joint Factorization Minimizing the Kullback-Leibler Divergence

Recently, there has been growing interest in the estimation of the Probability Mass Function (PMF) of discrete random vectors (RVs) from partial observations thereof, namely when each observed realization of the RV is limited to a random subset of its elements. It was shown that under a low-rank assumption on the PMF tensor (and some additional mild conditions), the full tensor can be recovered, e.g., by applying an approximate coupled factorization to empirical estimates of all joint PMFs of subgroups of some fixed cardinality larger than two (e.g., triplets). This coupled factorization is based on a Least Squares (LS) fit to the empirically estimated lower-order sub-tensors. In this work we take a different approach, fitting the coupled factorization to the estimated sub-tensors by minimizing the Kullback-Leibler divergence (KLD) between the estimated and inferred tensors. We explain why KLD-based fitting is better suited than LS-based fitting for the problem of PMF estimation, propose an associated minimization approach, and demonstrate its advantages over LS-based fitting in this context via simulation results.
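To illustrate the flavor of KLD-based fitting discussed above, the sketch below fits a rank-R nonnegative (PMF) factorization to a single, fully observed empirical 3-way PMF tensor by Expectation-Maximization for the equivalent naive-Bayes mixture model, whose updates monotonically decrease the KLD. This is a hypothetical toy example in NumPy for a single tensor, not the paper's coupled factorization over estimated sub-tensors; the function name `em_pmf_rank` and all parameter choices are ours, not the authors'.

```python
import numpy as np

def em_pmf_rank(T, R, iters=300, seed=0):
    """Fit a rank-R PMF factorization
        M[i,j,k] = sum_r lam[r] * A[i,r] * B[j,r] * C[k,r]
    to an empirical 3-way joint PMF tensor T by minimizing KL(T || M),
    using EM updates for the equivalent naive-Bayes mixture model.
    Returns mixing weights lam (summing to 1) and stochastic factors
    A, B, C (each column summing to 1)."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    lam = np.full(R, 1.0 / R)
    A = rng.random((I, R)); A /= A.sum(axis=0)
    B = rng.random((J, R)); B /= B.sum(axis=0)
    C = rng.random((K, R)); C /= C.sum(axis=0)
    for _ in range(iters):
        # Current model tensor and elementwise ratio T / M (E-step).
        M = np.einsum('r,ir,jr,kr->ijk', lam, A, B, C)
        ratio = np.where(M > 0, T / np.maximum(M, 1e-300), 0.0)
        # M-step: accumulate responsibilities into each parameter block.
        lam_new = lam * np.einsum('ijk,ir,jr,kr->r', ratio, A, B, C)
        A_new = A * np.einsum('ijk,r,jr,kr->ir', ratio, lam, B, C)
        B_new = B * np.einsum('ijk,r,ir,kr->jr', ratio, lam, A, C)
        C_new = C * np.einsum('ijk,r,ir,jr->kr', ratio, lam, A, B)
        # Renormalize to keep the probability-simplex constraints.
        lam = lam_new / lam_new.sum()
        A = A_new / A_new.sum(axis=0)
        B = B_new / B_new.sum(axis=0)
        C = C_new / C_new.sum(axis=0)
    return lam, A, B, C
```

In this toy setting the simplex constraints are handled by the closed-form EM renormalization; the coupled, sub-tensor-based formulation treated in the paper requires a joint objective over all estimated marginals and is more involved.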