IV-GNN: interval-valued data handling using graph neural network

Graph Neural Networks (GNNs) are a powerful tool for performing standard machine learning on graphs. To obtain a Euclidean representation of every node in non-Euclidean, graph-structured data, a GNN recursively aggregates and combines neighbourhood information along the edges of the graph. Although many GNN variants exist in the literature, none of them can handle graphs whose nodes carry interval-valued features. This article proposes the Interval-Valued Graph Neural Network (IV-GNN), a novel GNN model in which, for the first time, we relax the restriction that the feature space be countable. Our model is more general than existing ones, since any countable set is a subset of the reals R, which are uncountable. To deal with interval-valued feature vectors, we propose a new aggregation scheme for intervals and show that it is expressive enough to capture different interval structures. We validate our theoretical findings on graph classification tasks by comparing the model's performance with that of state-of-the-art models on several benchmark network datasets and synthetic datasets.
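
As a rough illustration of what recursive aggregation over interval-valued node features could look like, the Python sketch below pools each node's neighbour intervals with an interval hull and merges the result with the node's own interval. The hull operator, the function names, and the toy graph are illustrative assumptions only; they do not reproduce the aggregation scheme proposed in the article.

```python
# A minimal sketch (an assumption, NOT the paper's IV-GNN operator): one round
# of neighbourhood aggregation over interval-valued node features. Each feature
# is an interval (lo, hi); the aggregator used here is the interval hull
# (min of lower bounds, max of upper bounds), chosen purely for illustration.

from typing import Dict, List, Tuple

Interval = Tuple[float, float]   # (lower bound, upper bound)
Graph = Dict[int, List[int]]     # adjacency list: node -> list of neighbours


def interval_hull(intervals: List[Interval]) -> Interval:
    """Smallest interval containing every interval in the list."""
    lowers, uppers = zip(*intervals)
    return (min(lowers), max(uppers))


def aggregate_step(graph: Graph, features: Dict[int, Interval]) -> Dict[int, Interval]:
    """One aggregate-and-combine round: each node pools its neighbours'
    intervals and merges the pooled interval with its own."""
    updated: Dict[int, Interval] = {}
    for node, neighbours in graph.items():
        pooled = (
            interval_hull([features[n] for n in neighbours])
            if neighbours
            else features[node]
        )
        # COMBINE step: here simply the hull of the node's own interval and the pooled one.
        updated[node] = interval_hull([features[node], pooled])
    return updated


if __name__ == "__main__":
    # Toy path graph 0 - 1 - 2 with interval-valued scalar features.
    graph = {0: [1], 1: [0, 2], 2: [1]}
    features = {0: (0.0, 0.5), 1: (0.2, 0.3), 2: (0.9, 1.0)}
    print(aggregate_step(graph, features))
    # Node 1 becomes (0.0, 1.0): the hull of its own interval and both neighbours' intervals.
```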
