Information Theoretic Bounds for Tensor Rank Minimization over Finite Fields

We consider the problem of noiseless and noisy low-rank tensor completion from a set of random linear measurements. In our derivations, we assume that the entries of the tensor belong to a finite field of arbitrary size and that reconstruction is carried out within a rank-minimization framework. The derived results show that the smallest number of measurements needed for exact reconstruction is upper bounded by the product of the rank, the order, and the dimension of a cubic tensor. Furthermore, this number of measurements also suffices for the rank minimizer to be unique. Similar bounds hold for the noisy rank-minimization scenario, up to a scaling function that depends on the channel error probability.
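
As a brief illustration of the noiseless bound (the notation below is introduced here for clarity and is not part of the abstract): for an order-$d$ cubic tensor with side length $n$ and rank $r$ over a finite field, the stated upper bound can be written as

    N^{\ast} \;\le\; r \, d \, n,

where $N^{\ast}$ denotes the minimum number of random linear measurements required for exact reconstruction. In the noisy setting, the corresponding bound is assumed to carry an additional scaling function of the channel error probability.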