Searching to Sparsify Tensor Decomposition for N-ary Relational Data

Tensors, which extend vectors and matrices to the multi-dimensional case, are a natural way to describe N-ary relational data. Recently, tensor decomposition methods have been introduced to N-ary relational data and have become the state-of-the-art for embedding learning. However, the performance of existing tensor decomposition methods is still not as good as desired. First, they suffer from data sparsity, since each of them can only learn from facts of one specific arity, i.e., only a part of the available N-ary relational data. Second, they are neither effective nor efficient to train because of over-parameterization. In this paper, we propose S2S, a novel method for effectively and efficiently learning from N-ary relational data. Specifically, we introduce a new tensor decomposition framework that shares embeddings across arities, so that facts of mixed arity can be learned jointly. Since the core tensors may still be over-parameterized, we reduce the number of parameters by sparsifying the core tensors while retaining their expressive power, using neural architecture search (NAS) techniques, which can discover data-dependent architectures. As a result, the proposed S2S is not only guaranteed to be expressive but also learns efficiently from facts of mixed arity. Finally, empirical results demonstrate that S2S is efficient to train and achieves state-of-the-art performance.
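To make the core-tensor idea concrete, the following is a minimal sketch (not the paper's actual architecture) of scoring a single 3-ary fact with a Tucker-style contraction. A random mask stands in for the data-dependent sparsity pattern that S2S would search for with NAS; all names and dimensions are illustrative assumptions.

```python
import numpy as np

# Hypothetical embedding dimension and a fixed arity of 3 for this sketch.
d = 8
rng = np.random.default_rng(0)

# Embeddings for one relation and the three entities of a 3-ary fact.
r = rng.normal(size=d)
e1, e2, e3 = rng.normal(size=(3, d))

# Dense Tucker-style core tensor for arity 3 (one mode per entity slot plus
# one for the relation): d^4 parameters -- the over-parameterization the
# abstract refers to.
core = rng.normal(size=(d, d, d, d))

# Sparsify the core: keep only a small fraction of its entries. Here a random
# mask is used purely for illustration; S2S would instead search for a
# data-dependent sparsity structure.
mask = rng.random(core.shape) < 0.05
sparse_core = core * mask

# Score of the fact = sparse core tensor contracted with all embeddings.
score = np.einsum("ijkl,i,j,k,l->", sparse_core, r, e1, e2, e3)
print(score)
```

Under this reading, sharing the entity and relation embeddings across arities while keeping one (sparse) core per arity is what lets facts of mixed arity be trained jointly without a separate embedding table per arity.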
