
Can Classic GNNs Be Strong Baselines for Graph-level Tasks? Simple Architectures Meet Excellence

Yuankai Luo, Lei Shi, Xiao-Ming Wu
Abstract

Message-passing Graph Neural Networks (GNNs) are often criticized for their limited expressiveness, issues like over-smoothing and over-squashing, and challenges in capturing long-range dependencies. Conversely, Graph Transformers (GTs) are regarded as superior due to their employment of global attention mechanisms, which potentially mitigate these challenges. Literature frequently suggests that GTs outperform GNNs in graph-level tasks, especially for graph classification and regression on small molecular graphs. In this study, we explore the untapped potential of GNNs through an enhanced framework, GNN+, which integrates six widely used techniques: edge feature integration, normalization, dropout, residual connections, feed-forward networks, and positional encoding, to effectively tackle graph-level tasks. We conduct a systematic re-evaluation of three classic GNNs (GCN, GIN, and GatedGCN) enhanced by the GNN+ framework across 14 well-known graph-level datasets. Our results reveal that, contrary to prevailing beliefs, these classic GNNs consistently match or surpass the performance of GTs, securing top-three rankings across all datasets and achieving first place in eight. Furthermore, they demonstrate greater efficiency, running several times faster than GTs on many datasets. This highlights the potential of simple GNN architectures, challenging the notion that complex mechanisms in GTs are essential for superior graph-level performance. Our source code is available at https://github.com/LUOyk1999/GNNPlus.
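For intuition, below is a minimal sketch in PyTorch of what one layer combining the six techniques named above could look like. The class and helper names (GNNPlusLayer, edge_lin, etc.) are hypothetical illustrations, not the authors' implementation; consult the linked repository for the actual GNN+ code. Positional encodings are assumed to be added to the node features once, before the first layer.

    # Hypothetical sketch of a GNN+-style layer: edge feature integration,
    # normalization, dropout, residual connections, and a feed-forward network.
    # Not the authors' code; see https://github.com/LUOyk1999/GNNPlus.
    import torch
    import torch.nn as nn

    class GNNPlusLayer(nn.Module):
        def __init__(self, dim, dropout=0.1):
            super().__init__()
            self.lin = nn.Linear(dim, dim)
            self.edge_lin = nn.Linear(dim, dim)   # edge feature integration
            self.norm1 = nn.LayerNorm(dim)        # normalization
            self.norm2 = nn.LayerNorm(dim)
            self.drop = nn.Dropout(dropout)       # dropout
            self.ffn = nn.Sequential(             # feed-forward network
                nn.Linear(dim, 2 * dim), nn.ReLU(), nn.Linear(2 * dim, dim)
            )

        def forward(self, x, edge_index, edge_attr):
            # x: [N, dim] node features; edge_index: [2, E]; edge_attr: [E, dim]
            src, dst = edge_index
            # Messages combine transformed neighbor and edge features.
            msg = torch.relu(self.lin(x)[src] + self.edge_lin(edge_attr))
            agg = torch.zeros_like(x).index_add_(0, dst, msg)  # sum aggregation
            x = x + self.drop(self.norm1(agg))       # residual connection
            x = x + self.drop(self.norm2(self.ffn(x)))  # residual over FFN
            return x

    # Usage sketch: pos_enc would come from, e.g., a random-walk encoding.
    # x = node_feats + pos_enc                 # positional encoding
    # for layer in layers:
    #     x = layer(x, edge_index, edge_attr)
    # graph_repr = x.mean(dim=0)               # readout for a graph-level task

The point of the sketch is that each ingredient is a standard, inexpensive building block layered onto a classic message-passing scheme, which is consistent with the paper's claim that no Transformer-style global attention is needed.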
