Dialogue Relation Extraction on DialogRE
Metrics: F1 (v1), F1c (v1)
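F1 (v1) is the standard micro-averaged F1 over predicted relation triples, and F1c (v1) is the conversational F1 variant introduced with the DialogRE benchmark, evaluated on partial dialogue input. As a rough illustration only (reported numbers come from the official DialogRE evaluation script; the triple format and relation labels below are purely illustrative), micro-F1 can be sketched as:

```python
from typing import Iterable, Tuple

def micro_f1(predicted: Iterable[Tuple], gold: Iterable[Tuple]) -> float:
    """Micro-averaged F1 over sets of (subject, relation, object) triples."""
    pred_set, gold_set = set(predicted), set(gold)
    tp = len(pred_set & gold_set)                      # correctly predicted relations
    precision = tp / len(pred_set) if pred_set else 0.0
    recall = tp / len(gold_set) if gold_set else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Example: two gold relations, one of which is recovered -> F1 = 0.667
gold = [("Speaker 1", "per:friends", "Speaker 2"),
        ("Speaker 1", "per:title", "doctor")]
pred = [("Speaker 1", "per:friends", "Speaker 2")]
print(round(micro_f1(pred, gold), 3))
```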
Results
Performance results of various models on this benchmark
| Model name         | F1 (v1) | F1c (v1) | Paper Title                                                                                              | Repository |
|--------------------|---------|----------|----------------------------------------------------------------------------------------------------------|------------|
| BERTS              | 61.2    | 55.4     | Dialogue-Based Relation Extraction                                                                        | -          |
| BERT+SIEF          | 61.8    | 58.4     | Document-Level Relation Extraction with Sentences Importance Estimation and Focusing                      | -          |
| Dual               | 67.3    | 61.4     | Semantic Representation for Dialogue Modeling                                                             | -          |
| HiDialog           | -       | -        | Hierarchical Dialogue Understanding with Special Tokens and Turn-level Attention                          | -          |
| GRASP_Large        | 75.1    | 66.7     | GRASP: Guiding model with RelAtional Semantics using Prompt for Dialogue Relation Extraction              | -          |
| TUCORE-GCN_RoBERTa | -       | -        | Graph Based Network with Contextualized Representations of Turns in Dialogue                              | -          |
| GRASP_Base         | 69.2    | 62.4     | GRASP: Guiding model with RelAtional Semantics using Prompt for Dialogue Relation Extraction              | -          |
| SocAoG             | -       | -        | SocAoG: Incremental Graph Parsing for Social Relation Inference in Dialogues                              | -          |
| SimpleRE           | 66.3    | -        | An Embarrassingly Simple Model for Dialogue Relation Extraction                                           | -          |
| GDPNet             | 64.9    | 60.1     | GDPNet: Refining Latent Multi-View Graph for Relation Extraction                                          | -          |
| Joint_RoBERTa      | -       | -        | D-REX: Dialogue Relation Extraction with Explanations                                                     | -          |
| BiLSTM             | 48.6    | 45.0     | Dialogue-Based Relation Extraction                                                                        | -          |
| DHGAT              | 56.1    | 50.7     | Dialogue Relation Extraction with Document-level Heterogeneous Graph Attention Networks                   | -          |
| KnowPrompt         | 66.0    | -        | KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation Extraction           | -          |
| DiaRE-D2G          | -       | -        | Global inference with explicit syntactic and discourse structures for dialogue-level relation extraction  | -          |
| D-REX_RoBERTa      | -       | -        | D-REX: Dialogue Relation Extraction with Explanations                                                     | -          |
| TUCORE-GCN_BERT    | -       | -        | Graph Based Network with Contextualized Representations of Turns in Dialogue                              | -          |