Discourse Parsing on Instructional-DT (Instr-DT)
Evaluation metrics
Standard Parseval (Full)
Standard Parseval (Nuclearity)
Standard Parseval (Relation)
Standard Parseval (Span)
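The four Standard Parseval columns in the table below are F1 scores over the labeled constituents of the predicted RST trees: Span scores the bracketing alone, Nuclearity and Relation additionally require the nuclearity or relation label to match, and Full requires both. The following is a minimal Python sketch of how such scores are commonly computed; the `Constituent` type, the function names, and the exact counting conventions (e.g. which tree nodes are included) are illustrative assumptions, not the evaluation code used by the papers listed here.

```python
from collections import Counter
from typing import NamedTuple

class Constituent(NamedTuple):
    """One labeled constituent of an RST tree: the EDU span it covers,
    its nuclearity label (e.g. "NS"), and its relation label (e.g. "Elaboration")."""
    start: int   # index of the first EDU covered
    end: int     # index of the last EDU covered
    nuclearity: str
    relation: str

def _f1(gold: Counter, pred: Counter) -> float:
    """Micro-averaged F1 between two multisets of constituent keys."""
    if not gold or not pred:
        return 0.0
    matched = sum((gold & pred).values())  # multiset intersection
    if matched == 0:
        return 0.0
    precision = matched / sum(pred.values())
    recall = matched / sum(gold.values())
    return 2 * precision * recall / (precision + recall)

def standard_parseval(gold_trees: list[list[Constituent]],
                      pred_trees: list[list[Constituent]]) -> dict[str, float]:
    """Span / Nuclearity / Relation / Full F1, pooling constituents
    from all documents (micro-average)."""
    keys = {
        "Span":       lambda c: (c.start, c.end),
        "Nuclearity": lambda c: (c.start, c.end, c.nuclearity),
        "Relation":   lambda c: (c.start, c.end, c.relation),
        "Full":       lambda c: (c.start, c.end, c.nuclearity, c.relation),
    }
    scores = {}
    for name, key in keys.items():
        gold = Counter(key(c) for tree in gold_trees for c in tree)
        pred = Counter(key(c) for tree in pred_trees for c in tree)
        scores[name] = _f1(gold, pred)
    return scores
```

Note that pooling constituents across documents gives a micro-average; some evaluations macro-average per document instead, which can shift the numbers slightly.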
Evaluation results
Performance of each model on this benchmark
| Model | Standard Parseval (Full) | Standard Parseval (Nuclearity) | Standard Parseval (Relation) | Standard Parseval (Span) | Paper Title | Repository |
|---|---|---|---|---|---|---|
| Top-down (XLNet) | 40.2 | 55.2 | 47.0 | 74.3 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Bottom-up (DeBERTa) | 44.4 | 60.0 | 51.4 | 77.8 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Top-down (BERT) | 30.9 | 44.6 | 37.6 | 65.3 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Top-down (SpanBERT) | 36.7 | 54.5 | 42.7 | 73.7 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Top-down (DeBERTa) | 43.4 | 57.9 | 50.0 | 77.3 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Guz et al. (2020) | - | 44.41 | - | 64.55 | Unleashing the Power of Neural Discourse Parsers -- A Context and Structure Aware Approach Using Large Scale Pretraining | - |
| Bottom-up (SpanBERT) | 40.5 | 53.8 | 46.0 | 72.9 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Top-down (RoBERTa) | 41.5 | 56.1 | 48.7 | 75.7 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Bottom-up (BERT) | 32.9 | 46.3 | 39.5 | 66.6 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Guz et al. (2020) (pretrained) | - | 46.59 | - | 65.41 | Unleashing the Power of Neural Discourse Parsers -- A Context and Structure Aware Approach Using Large Scale Pretraining | - |
| Bottom-up (XLNet) | 40.7 | 56.4 | 47.4 | 73.6 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Bottom-up (RoBERTa) | 41.4 | 55.5 | 47.9 | 73.2 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |