Graph Classification on NCI1
Evaluation Metric: Accuracy

Evaluation Results
Performance of each model on this benchmark.

| Model Name | Accuracy | Paper Title | Repository |
| --- | --- | --- | --- |
| SAGPool_g | 74.06% | Self-Attention Graph Pooling | |
| GIUNet | 80.2% | Graph isomorphism UNet | - |
| TokenGT | 76.740±2.054 | Pure Transformers are Powerful Graph Learners | |
| SF + RFC | 75.2% | A Simple Baseline Algorithm for Graph Classification | |
| GIC | 84.08% | Gaussian-Induced Convolution for Graphs | - |
| DDGK | 68.1% | DDGK: Learning Graph Representations for Deep Divergence Graph Kernels | |
| CIN++ | 85.3% | CIN++: Enhancing Topological Message Passing | |
| SAGPool_h | 67.45% | Self-Attention Graph Pooling | |
| WL-OA | 86.1% | On Valid Optimal Assignment Kernels and Applications to Graph Classification | - |
| FGW wl h=4 sp | 86.42% | Optimal Transport for structured data with application on graphs | |
| GFN | 83.65% | Are Powerful Graph Neural Nets Necessary? A Dissection on Graph Classification | |
| k-GNN | 76.2% | Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks | |
| WKPI-kmeans | 87.2% | Learning metrics for persistence-based summaries and applications for graph classification | |
| CAN | 84.5% | Cell Attention Networks | |
| ASAP | 71.48 | ASAP: Adaptive Structure Aware Pooling for Learning Hierarchical Graph Representations | |
| Fea2Fea-s3 | 74.9% | Fea2Fea: Exploring Structural Feature Correlations via Graph Neural Networks | |
| sGIN | 83.85% | Mutual Information Maximization in Graph Neural Networks | |
| graph2vec | 73.22% ± 1.81% | graph2vec: Learning Distributed Representations of Graphs | |
| WWL | 85.75% | Wasserstein Weisfeiler-Lehman Graph Kernels | |
| DAGCN | 81.68% | DAGCN: Dual Attention Graph Convolutional Networks | |
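As a quick illustration of how accuracy is measured on this benchmark, the sketch below trains a small GIN baseline on NCI1 loaded through PyTorch Geometric's TUDataset and reports holdout accuracy. The model, split, and hyperparameters are illustrative assumptions only and do not reproduce any leaderboard entry; published NCI1 results are commonly reported with 10-fold cross-validation rather than the single split used here.

```python
# Minimal sketch: a GIN baseline evaluated by accuracy on NCI1.
# Assumes PyTorch Geometric is installed; all settings are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch_geometric.datasets import TUDataset
from torch_geometric.loader import DataLoader
from torch_geometric.nn import GINConv, global_add_pool

dataset = TUDataset(root='data/TUDataset', name='NCI1').shuffle()
n_train = int(0.9 * len(dataset))                      # simple 90/10 holdout split
train_loader = DataLoader(dataset[:n_train], batch_size=64, shuffle=True)
test_loader = DataLoader(dataset[n_train:], batch_size=64)

class GIN(nn.Module):
    def __init__(self, in_dim, hidden=64, num_classes=2):
        super().__init__()
        def mlp(i, o):
            return nn.Sequential(nn.Linear(i, o), nn.ReLU(), nn.Linear(o, o))
        self.conv1 = GINConv(mlp(in_dim, hidden))
        self.conv2 = GINConv(mlp(hidden, hidden))
        self.lin = nn.Linear(hidden, num_classes)

    def forward(self, x, edge_index, batch):
        x = F.relu(self.conv1(x, edge_index))
        x = F.relu(self.conv2(x, edge_index))
        x = global_add_pool(x, batch)                   # graph-level readout
        return self.lin(x)

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = GIN(dataset.num_node_features, num_classes=dataset.num_classes).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def evaluate(loader):
    # Accuracy = fraction of graphs whose predicted class matches the label.
    model.eval()
    correct = 0
    with torch.no_grad():
        for data in loader:
            data = data.to(device)
            pred = model(data.x, data.edge_index, data.batch).argmax(dim=1)
            correct += int((pred == data.y).sum())
    return correct / len(loader.dataset)

for epoch in range(1, 51):
    model.train()
    for data in train_loader:
        data = data.to(device)
        optimizer.zero_grad()
        loss = F.cross_entropy(model(data.x, data.edge_index, data.batch), data.y)
        loss.backward()
        optimizer.step()

print(f"test accuracy: {evaluate(test_loader):.4f}")
```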