Complementing Logical Reasoning with Sub-symbolic Commonsense

Neuro-symbolic integration is an active research field in which symbolic approaches are combined with deep learning. In this work we start from simple non-relational knowledge that can be extracted from text by considering the co-occurrence of entities in textual corpora, and we show that this knowledge can be easily integrated with Logic Tensor Networks (LTNs), a neuro-symbolic model. With LTNs, axioms and facts can be combined with commonsense knowledge represented in sub-symbolic form within a single model that performs well on reasoning tasks. Despite some current limitations, our results are promising.
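The non-relational knowledge mentioned above is derived from entity co-occurrence statistics in text. As a minimal sketch of that counting step (the corpus, entity list, and function name here are purely illustrative, not the paper's actual pipeline), one might gather sentence-level co-occurrence counts like this:

```python
from collections import Counter
from itertools import combinations

def cooccurrence_counts(sentences, entities):
    """Count how often each pair of entities appears in the same sentence."""
    counts = Counter()
    for sentence in sentences:
        tokens = set(sentence.lower().split())
        present = [e for e in entities if e in tokens]
        # Count each unordered entity pair once per sentence.
        for a, b in combinations(sorted(present), 2):
            counts[(a, b)] += 1
    return counts

# Toy corpus for illustration only.
corpus = [
    "the cat chased the mouse",
    "the cat sat near the dog",
    "the dog barked at the mouse",
]
pairs = cooccurrence_counts(corpus, entities={"cat", "dog", "mouse"})
print(pairs[("cat", "mouse")])  # 1
```

Counts of this kind (possibly normalized, e.g. into PMI scores) can then serve as the sub-symbolic signal grounding the commonsense facts fed to the LTN.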
