Alternating-Direction-Method-of-Multipliers-Based Symmetric Nonnegative Latent Factor Analysis for Large-Scale Undirected Weighted Networks
Large-scale undirected weighted networks are frequently encountered in real applications. They can be described by a Symmetric, High-Dimensional and Sparse (SHiDS) matrix, whose sparse and symmetric data must be handled with care. However, existing models either fail to handle its sparsity effectively or fail to correctly describe its symmetry. To address these issues, this study proposes an Alternating-Direction-Method-of-Multipliers-based Symmetric Nonnegative Latent Factor Analysis (ASNL) model. Its main idea is three-fold: 1) introducing an equality constraint into a data-density-oriented learning objective for a flexible and effective learning process; 2) confining the augmented term to be data-density-oriented to enhance the model's generalization ability; and 3) utilizing the principle of the alternating direction method of multipliers to divide a complex optimization task into multiple simple subtasks, each of which is solved based on the results of the previously solved ones. Empirical studies on two SHiDS matrices demonstrate that ASNL achieves higher prediction accuracy for their missing data than state-of-the-art models, with competitive computational efficiency.
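The three-fold idea above can be sketched in code. The following is a minimal, illustrative ADMM-style sketch, not the authors' exact ASNL model: it fits a symmetric matrix S by X·Xᵀ, counts the loss only over observed entries (the data-density-oriented objective), introduces an auxiliary factor Y with the equality constraint X = Y, and alternates three simple subtasks — a gradient step on X, a closed-form nonnegative projection for Y, and a dual update. The function name `asnl_sketch` and all hyperparameters are assumptions for illustration.

```python
import numpy as np

def asnl_sketch(S, mask, rank=4, rho=1.0, lr=0.01, iters=500, seed=0):
    """Illustrative ADMM-style symmetric nonnegative latent factor sketch.

    Approximates the observed entries of a symmetric matrix S (mask marks
    observed positions with 1) by X @ X.T. An auxiliary variable Y carries
    the nonnegativity constraint, and the equality constraint X = Y is
    enforced through a scaled dual variable U, ADMM-style. This is a
    hypothetical simplification of the ASNL idea, not the paper's model.
    """
    rng = np.random.default_rng(seed)
    n = S.shape[0]
    X = 0.1 * rng.random((n, rank))   # unconstrained factor
    Y = X.copy()                      # nonnegative auxiliary factor
    U = np.zeros_like(X)              # scaled dual variable for X = Y
    for _ in range(iters):
        # Subtask 1: gradient step on the data-density-oriented loss
        # (residual counted only on observed entries) plus augmented term.
        R = mask * (X @ X.T - S)
        grad = 2.0 * (R + R.T) @ X + rho * (X - Y + U)
        X -= lr * grad
        # Subtask 2: Y-update, a closed-form projection onto the
        # nonnegative orthant, using the result of the X-update.
        Y = np.maximum(X + U, 0.0)
        # Subtask 3: dual update for the equality constraint X = Y.
        U += X - Y
    return Y
```

Because each subtask reuses the most recent results of the others, every step stays simple (one gradient step, one elementwise projection, one addition), which is what makes the ADMM decomposition attractive for large sparse inputs; a production implementation would iterate only over the stored nonzero entries rather than forming dense n×n residuals.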