HyperAI초신경
Defect Detection On Codexglue Devign

Evaluation metric: Accuracy

Evaluation results: the performance of each model on this benchmark.
| Model | Accuracy | Paper Title |
|---|---|---|
| PLBART + GFSA | 62.96 | Graph Convolutions Enrich the Self-Attention in Transformers! |
| CodeT5-small + GFSA | 63.69 | Graph Convolutions Enrich the Self-Attention in Transformers! |
| CodeT5-small | 63.25 | Graph Convolutions Enrich the Self-Attention in Transformers! |
| RoBERTa + GFSA | 64.39 | Graph Convolutions Enrich the Self-Attention in Transformers! |
| PLBART | 62.63 | Graph Convolutions Enrich the Self-Attention in Transformers! |
| CodeT5 | 65.78 | CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation |
| CodeT5-base | 63.51 | Graph Convolutions Enrich the Self-Attention in Transformers! |
| CodeBERT + GFSA | 64.49 | Graph Convolutions Enrich the Self-Attention in Transformers! |
| RoBERTa | 62.88 | Graph Convolutions Enrich the Self-Attention in Transformers! |
| CodeBERT | 64.31 | Graph Convolutions Enrich the Self-Attention in Transformers! |
| CodeBERT | 62.08 | CodeXGLUE: A Machine Learning Benchmark Dataset for Code Understanding and Generation |
| CodeT5-base + GFSA | 64.75 | Graph Convolutions Enrich the Self-Attention in Transformers! |
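The benchmark's single metric is accuracy: defect detection on Devign is a binary classification task (vulnerable vs. not vulnerable), and each model is scored by the percentage of functions it labels correctly. A minimal sketch of that computation, with hypothetical prediction and gold-label lists (not taken from the benchmark itself):

```python
def accuracy(predictions, labels):
    """Percentage of examples where the predicted label matches the gold label."""
    assert len(predictions) == len(labels) and labels, "need matched, non-empty lists"
    correct = sum(p == g for p, g in zip(predictions, labels))
    return 100.0 * correct / len(labels)

# Hypothetical 4-example slice: 1 = vulnerable, 0 = not vulnerable.
preds = [1, 0, 1, 1]
golds = [1, 0, 0, 1]
print(f"{accuracy(preds, golds):.2f}")  # → 75.00
```

The leaderboard values above (e.g. 65.78 for CodeT5) are this percentage computed over the full Devign test split.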