Link Prediction and Node Classification Based on Multitask Graph Autoencoder

The goal of network representation learning is to extract high-level abstractions from raw data features, a process that can also be viewed as mapping high-dimensional data to low-dimensional features; learning the mapping function between the two vector spaces is an essential problem. In this paper, we propose a new similarity index grounded in traditional machine learning that integrates the concepts of common neighbors, local paths, and preferential attachment. Furthermore, to extend link prediction methods to the task of node classification, we introduce an architecture named the multitask graph autoencoder. Specifically, building on structural deep network embedding, the architecture incorporates a high-order loss function that measures node similarity from multiple angles, compensating for the deficiency of the second-order loss function. Through parameter fine-tuning, the high-order loss function is integrated into the optimized autoencoder. Experiments demonstrate that the framework is generally applicable to the majority of classical similarity indexes.
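A similarity index mixing the three signals named above can be sketched as follows. This is a minimal illustration, not the paper's exact formula: the weighting parameters `alpha` and `beta` and the use of length-3 paths as the local-path term are assumptions made for the example.

```python
def combined_similarity(adj, u, v, alpha=0.5, beta=0.01):
    """Score a candidate link (u, v) on an undirected graph.

    adj: dict mapping each node to the set of its neighbors.
    Combines three classical link-prediction signals:
      - common-neighbor count,
      - a local-path term (length-3 paths, damped by beta),
      - preferential attachment (degree product, weighted by alpha).
    """
    nu, nv = adj[u], adj[v]
    common = len(nu & nv)                        # common neighbors = paths of length 2
    # Length-3 paths u-w-x-v: for each neighbor w of u, count nodes
    # adjacent to both w and v (degenerate paths are tolerated in this sketch).
    paths3 = sum(len(adj[w] & nv) for w in nu)
    pa = len(nu) * len(nv)                       # preferential attachment
    return common + beta * paths3 + alpha * pa
```

On a small graph, a node pair sharing a common neighbor scores above a pair connected only through a longer path, as expected of local-path-style indexes.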
