Subspace projection matrix completion on Grassmann manifold

In this paper, we study the problem of estimating a subspace from a random downsampling of its projection matrix. We formulate projection matrix completion as an optimization problem on the Grassmann manifold and propose an iterative gradient descent line-search algorithm on the Grassmann manifold (GGDLS) to solve it. The convergence of the proposed algorithm is theoretically guaranteed, and numerical experiments verify that, in the noiseless case, 2s(N - s) samples suffice to recover a rank-s projection matrix in ℝ^{N×N} with probability 1. Compared with several reference algorithms in the noiseless scenario, the proposed algorithm is very time efficient and requires a rather small number of samples for successful recovery. In the noisy scenario, the proposed GGDLS is remarkably robust against noise under both high and low measurement SNR.
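
To make the setting concrete, the following is a minimal numerical sketch (in NumPy, not the authors' GGDLS code) of Riemannian gradient descent with backtracking line search on the Grassmann manifold for projection-matrix completion. The least-squares cost, the QR-based retraction, the Armijo step-size rule, and all parameter values here are illustrative assumptions rather than details taken from the paper.

# Sketch only: recover P = U U^T from randomly observed entries by descending
# on the Grassmann manifold of s-dimensional subspaces of R^N.
import numpy as np

def grassmann_completion(P_obs, mask, s, max_iter=500, tol=1e-8):
    N = P_obs.shape[0]
    # Random orthonormal initialization U in R^{N x s}
    U, _ = np.linalg.qr(np.random.randn(N, s))

    def cost(U):
        R = mask * (U @ U.T - P_obs)          # residual on observed entries only
        return 0.5 * np.sum(R ** 2)

    def riemannian_grad(U):
        R = mask * (U @ U.T - P_obs)
        euc = (R + R.T) @ U                   # Euclidean gradient of the cost
        return euc - U @ (U.T @ euc)          # project onto the horizontal space

    f = cost(U)
    for _ in range(max_iter):
        G = riemannian_grad(U)
        gnorm2 = np.sum(G ** 2)
        if gnorm2 < tol:
            break
        # Backtracking (Armijo) line search along -G, with a QR retraction
        # standing in for the exact geodesic step.
        step = 1.0
        while step > 1e-12:
            U_new, _ = np.linalg.qr(U - step * G)
            f_new = cost(U_new)
            if f_new <= f - 1e-4 * step * gnorm2:
                break
            step *= 0.5
        if f_new > f:
            break                             # line search failed; stop
        U, f = U_new, f_new
    return U @ U.T                            # completed projection matrix

# Usage: sample roughly 2s(N - s) entries of a rank-s projection matrix,
# the sampling number reported for successful recovery in the noiseless case.
if __name__ == "__main__":
    N, s = 50, 3
    U_true, _ = np.linalg.qr(np.random.randn(N, s))
    P_true = U_true @ U_true.T
    m = 2 * s * (N - s)
    idx = np.random.choice(N * N, m, replace=False)
    mask = np.zeros(N * N)
    mask[idx] = 1.0
    mask = mask.reshape(N, N)
    P_hat = grassmann_completion(mask * P_true, mask, s)
    print("relative error:", np.linalg.norm(P_hat - P_true) / np.linalg.norm(P_true))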
