Gated Graph Sequence Neural Networks

Graph-structured data appears frequently in domains including chemistry, natural language semantics, social networks, and knowledge bases. In this work, we study feature learning techniques for graph-structured inputs. Our starting point is previous work on Graph Neural Networks (Scarselli et al., 2009), which we modify to use gated recurrent units and modern optimization techniques and then extend to output sequences. The result is a flexible and broadly useful class of neural network models that has favorable inductive biases relative to purely sequence-based models (e.g., LSTMs) when the problem is graph-structured. We demonstrate the capabilities on some simple AI (bAbI) and graph algorithm learning tasks. We then show it achieves state-of-the-art performance on a problem from program verification, in which subgraphs need to be matched to abstract data structures.
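To make the abstract's core idea concrete, the sketch below illustrates one way the gated propagation could look: node states are updated for a fixed number of steps by aggregating neighbor messages and passing them through a GRU-style gated update. This is only an illustrative sketch, not the authors' implementation; the class name GGNNPropagation, the dense adjacency matrix, the single edge type, and the use of PyTorch's GRUCell are all assumptions made for brevity.

```python
# Minimal sketch of a gated graph propagation step (illustrative, not the paper's code).
# Assumes a dense adjacency matrix, a single edge type, and PyTorch's GRUCell
# as the gated recurrent update over node states.
import torch
import torch.nn as nn

class GGNNPropagation(nn.Module):  # hypothetical class name
    def __init__(self, hidden_dim):
        super().__init__()
        # Linear map producing the message each node sends along its edges.
        self.message = nn.Linear(hidden_dim, hidden_dim)
        # Gated recurrent update of node states.
        self.gru = nn.GRUCell(hidden_dim, hidden_dim)

    def forward(self, h, adj, steps=5):
        # h:   [num_nodes, hidden_dim] node states
        # adj: [num_nodes, num_nodes] adjacency matrix
        for _ in range(steps):
            m = adj @ self.message(h)  # aggregate messages from neighbors
            h = self.gru(m, h)         # gated update of each node's state
        return h

# Toy usage on a 4-node path graph:
h = torch.zeros(4, 8)
adj = torch.tensor([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=torch.float)
out = GGNNPropagation(8)(h, adj)
```

The gated update is what lets information propagate over several steps without the vanishing-signal problems of the original contraction-based GNN training, which is the modification the abstract highlights.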