Discourse Parsing on Instructional-DT (Instr-DT)
Evaluation Metrics
Standard Parseval (Full)
Standard Parseval (Nuclearity)
Standard Parseval (Relation)
Standard Parseval (Span)
A minimal sketch of how these four scores are computed is given below.
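The four Standard Parseval numbers reported on this page treat each RST-style discourse tree as a set of labeled constituents and score them at increasing strictness: span boundaries only, span plus nuclearity, span plus relation, and all three together (Full). The sketch below illustrates this per-document scoring. The constituent tuple layout and function names are illustrative assumptions, not taken from any particular parser or evaluation toolkit, and a corpus-level score would aggregate counts over all documents rather than averaging per-tree F1.

```python
# Minimal sketch of Standard Parseval scoring for RST-style discourse trees.
# Assumption: each parsed tree has already been reduced to a collection of
# constituents of the form (start_edu, end_edu, nuclearity, relation); the
# tuple layout and names are illustrative only.

def micro_f1(gold, pred):
    """F1 between two sets of constituents (per-document)."""
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)
    if tp == 0:
        return 0.0
    precision = tp / len(pred)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)

def standard_parseval(gold_tree, pred_tree):
    """Return the Span / Nuclearity / Relation / Full F1 scores.

    gold_tree, pred_tree: iterables of (start, end, nuclearity, relation).
    """
    views = {
        "Span":       lambda c: (c[0], c[1]),        # boundaries only
        "Nuclearity": lambda c: (c[0], c[1], c[2]),  # + nuclearity label
        "Relation":   lambda c: (c[0], c[1], c[3]),  # + relation label
        "Full":       lambda c: c,                   # all attributes
    }
    return {name: micro_f1([view(c) for c in gold_tree],
                           [view(c) for c in pred_tree])
            for name, view in views.items()}

# Example: a two-constituent gold tree and a prediction that gets the
# spans and nuclearity right but mislabels one relation.
gold = [(1, 4, "NS", "Elaboration"), (1, 2, "NN", "Joint")]
pred = [(1, 4, "NS", "Condition"),   (1, 2, "NN", "Joint")]
print(standard_parseval(gold, pred))
# -> Span and Nuclearity are 1.0, Relation and Full are 0.5
```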
Evaluation Results
Performance of each model on this benchmark.
| Model Name | Standard Parseval (Full) | Standard Parseval (Nuclearity) | Standard Parseval (Relation) | Standard Parseval (Span) | Paper Title | Repository |
|---|---|---|---|---|---|---|
| Top-down (XLNet) | 40.2 | 55.2 | 47.0 | 74.3 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Bottom-up (DeBERTa) | 44.4 | 60.0 | 51.4 | 77.8 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Top-down (BERT) | 30.9 | 44.6 | 37.6 | 65.3 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Top-down (SpanBERT) | 36.7 | 54.5 | 42.7 | 73.7 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Top-down (DeBERTa) | 43.4 | 57.9 | 50.0 | 77.3 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Guz et al. (2020) | - | 44.41 | - | 64.55 | Unleashing the Power of Neural Discourse Parsers -- A Context and Structure Aware Approach Using Large Scale Pretraining | - |
| Bottom-up (SpanBERT) | 40.5 | 53.8 | 46.0 | 72.9 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Top-down (RoBERTa) | 41.5 | 56.1 | 48.7 | 75.7 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Bottom-up (BERT) | 32.9 | 46.3 | 39.5 | 66.6 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Guz et al. (2020) (pretrained) | - | 46.59 | - | 65.41 | Unleashing the Power of Neural Discourse Parsers -- A Context and Structure Aware Approach Using Large Scale Pretraining | - |
| Bottom-up (XLNet) | 40.7 | 56.4 | 47.4 | 73.6 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |
| Bottom-up (RoBERTa) | 41.4 | 55.5 | 47.9 | 73.2 | A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing | |