A Truncated Newton Method-Based Symmetric Non-negative Latent Factor Model for Large-scale Undirected Network Representation*

Representation of a large-scale undirected network can be formulated as a non-convex optimization problem, which can be efficiently addressed by a symmetric non-negative latent factor analysis (SNLFA)-based approach. However, an SNLFA model commonly adopts a first-order optimization algorithm that cannot handle its non-convex learning objective well, resulting in an inaccurate representation of the target network. On the other hand, a higher-order learning algorithm such as a Newton-type one is expected to solve a non-convex optimization problem well, but its computational efficiency and scalability are greatly limited by its direct manipulation of the Hessian matrix, which can be huge in an SNLFA model. To address this issue, this paper proposes a Truncated Newton-method-based Symmetric Non-negative Latent Factor Analysis (TNS) model that exploits second-order information in the latent factor analysis process without explicitly manipulating the Hessian matrix, thereby achieving high computational efficiency and scalability. Empirical studies indicate that TNS outperforms state-of-the-art models in prediction accuracy with an affordable computational burden.
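To make the central idea concrete, the following is a minimal sketch of a truncated Newton step for a symmetric non-negative factorization objective. It assumes the generic loss f(X) = 0.5·||A − XXᵀ||²_F with X ≥ 0 (the paper's actual objective, regularization, and update scheme may differ); all function names are illustrative. The key point matches the abstract: the Hessian is never formed — conjugate gradients only needs Hessian-vector products, and the CG loop is truncated after a few iterations.

```python
import numpy as np

def grad(A, X):
    # Gradient of f(X) = 0.5 * ||A - X X^T||_F^2 for symmetric A.
    return 2.0 * (X @ X.T - A) @ X

def hess_vec(A, X, V):
    # Exact Hessian-vector product: the directional derivative of the
    # gradient along V. No (n*k) x (n*k) Hessian matrix is ever formed.
    return 2.0 * ((V @ X.T + X @ V.T) @ X + (X @ X.T - A) @ V)

def truncated_newton_step(A, X, cg_iters=10, tol=1e-8):
    # Approximately solve H p = -g by conjugate gradients, truncating
    # early and bailing out on negative curvature (H may be indefinite).
    g = grad(A, X)
    p = np.zeros_like(X)
    r = -g.copy()
    d = r.copy()
    rs = np.vdot(r, r)
    for _ in range(cg_iters):
        Hd = hess_vec(A, X, d)
        curv = np.vdot(d, Hd)
        if curv <= 0:                 # negative curvature detected
            if np.allclose(p, 0.0):
                p = -g                # fall back to steepest descent
            break
        alpha = rs / curv
        p += alpha * d
        r -= alpha * Hd
        rs_new = np.vdot(r, r)
        if rs_new < tol:
            break
        d = r + (rs_new / rs) * d
        rs = rs_new
    return p

def tns_sketch(A, k, iters=50, seed=0):
    # Outer loop: truncated Newton direction + backtracking line search,
    # projecting onto the non-negative orthant to keep X >= 0.
    rng = np.random.default_rng(seed)
    X = rng.random((A.shape[0], k))
    for _ in range(iters):
        p = truncated_newton_step(A, X)
        t = 1.0
        f0 = 0.5 * np.linalg.norm(A - X @ X.T, "fro") ** 2
        while t > 1e-10:
            Xn = np.maximum(X + t * p, 0.0)
            if 0.5 * np.linalg.norm(A - Xn @ Xn.T, "fro") ** 2 < f0:
                X = Xn
                break
            t *= 0.5
    return X
```

Because each accepted step strictly decreases the loss, the iteration is monotone; the cost per step is dominated by a handful of matrix products of size n×k, which is what makes the matrix-free approach scalable to large networks.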