Relation Extraction on TACRED
Metric: F1

Results: performance of various models on this benchmark.
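For reference, TACRED results are conventionally reported as micro-averaged F1 over the positive relation types, with no_relation excluded from the scored labels; it is an assumption, not stated on this page, that the F1 column below follows that convention. A minimal sketch of the metric, where "correct" counts predicted relations whose label matches the gold label:

\[
P = \frac{\#\,\text{correct}}{\#\,\text{predicted relations}}, \qquad
R = \frac{\#\,\text{correct}}{\#\,\text{gold relations}}, \qquad
F_1 = \frac{2\,P\,R}{P + R}
\]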
| Model name | F1 | Paper Title |
| --- | --- | --- |
| RAG4RE | 86.6 | Retrieval-Augmented Generation-based Relation Extraction |
| DeepStruct multi-task w/ finetune | 76.8 | DeepStruct: Pretraining of Language Models for Structure Prediction |
| UNiST (LARGE) | 75.5 | Unified Semantic Typing with Meaningful Label Inference |
| RE-MC | 75.4 | Enhancing Targeted Minority Class Prediction in Sentence-Level Relation Extraction |
| GenPT (T5) | 75.3 | Generative Prompt Tuning for Relation Classification |
| RECENT+SpanBERT | 75.2 | Relation Classification with Entity Type Restriction |
| SuRE (PEGASUS-large) | 75.1 | Summarization as Indirect Supervision for Relation Extraction |
| EXOBRAIN | 75.0 | Improving Sentence-Level Relation Extraction through Curriculum Learning |
| Relation Reduction | 74.8 | Relation Classification as Two-way Span-Prediction |
| RoBERTa-large-typed-marker | 74.6 | An Improved Baseline for Sentence-level Relation Extraction |
| NLI_DeBERTa | 73.9 | Label Verbalization and Entailment for Effective Zero- and Few-Shot Relation Extraction |
| Noise-robust Co-regularization + BERT-large | 73.0 | Learning from Noisy Labels for Entity-Centric Information Extraction |
| DeNERT-KG | 72.4 | DeNERT-KG: Named Entity and Relation Extraction Model Using DQN, Knowledge Graph, and BERT |
| K-ADAPTER (F+L) | 72.04 | K-Adapter: Infusing Knowledge into Pre-Trained Models with Adapters |
| TANL | 71.9 | Structured Prediction as Translation between Augmented Natural Languages |
| KEPLER | 71.7 | KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation |
| KnowBert-W+W | 71.5 | Knowledge Enhanced Contextual Word Representations |
| DG-SpanBERT-large | 71.5 | Efficient long-distance relation extraction with DG-SpanBERT |
| BERTEM+MTB | 71.5 | Matching the Blanks: Distributional Similarity for Relation Learning |
| RELA | 71.2 | Sequence Generation with Label Augmentation for Relation Extraction |
Showing 20 of 40 results.