Text Summarization on DUC 2004 Task 1
Evaluation Metrics
ROUGE-1, ROUGE-2, ROUGE-L
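The leaderboard does not state which scoring toolkit or settings it uses, so as a minimal sketch the snippet below computes ROUGE-1/2/L for one candidate headline against one reference using the open-source `rouge-score` package. The example sentences, the stemming option, and the choice of F-measure versus recall are assumptions for illustration and may differ from the official DUC 2004 protocol.

```python
# Minimal sketch: ROUGE-1/2/L for a single candidate vs. a single reference,
# using the `rouge-score` package (pip install rouge-score).
# The sentences and settings below are illustrative assumptions only.
from rouge_score import rouge_scorer

reference = "australia pledges aid for tsunami victims"   # hypothetical reference headline
candidate = "australia promises aid to tsunami victims"   # hypothetical system output

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)
scores = scorer.score(reference, candidate)

for name, result in scores.items():
    print(f"{name}: precision={result.precision:.4f} "
          f"recall={result.recall:.4f} f1={result.fmeasure:.4f}")
```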
Evaluation Results
Performance of each model on this benchmark:
| Model Name | ROUGE-1 | ROUGE-2 | ROUGE-L | Paper Title | Repository |
|---|---|---|---|---|---|
| DRGD | 31.79 | 10.75 | 27.48 | Deep Recurrent Generative Decoder for Abstractive Text Summarization | - |
| RAS-Elman | 28.97 | 8.26 | 24.06 | - | - |
| ABS | - | - | 22.05 | A Neural Attention Model for Abstractive Sentence Summarization | - |
| Transformer+LRPE+PE+Re-ranking+Ensemble | 32.85 | 11.78 | 28.52 | Positional Encoding to Control Output Sequence Length | - |
| Abs+ | 28.18 | 8.49 | 23.81 | A Neural Attention Model for Abstractive Sentence Summarization | - |
| words-lvt5k-1sent | 28.61 | 9.42 | 25.24 | Abstractive Text Summarization Using Sequence-to-Sequence RNNs and Beyond | - |
| Transformer+WDrop | 33.06 | 11.45 | 28.51 | Rethinking Perturbations in Encoder-Decoders for Fast Training | - |
| EndDec+WFE | 32.28 | 10.54 | 27.8 | Cutting-off Redundant Repeating Generations for Neural Abstractive Summarization | - |
| Transformer LM | - | 17.74 | - | Sample Efficient Text Summarization Using a Single Pre-Trained Transformer | - |
| Reinforced-Topic-ConvS2S | 31.15 | 10.85 | 27.68 | A Reinforced Topic-Aware Convolutional Sequence-to-Sequence Model for Abstractive Text Summarization | - |
| SEASS | 29.21 | 9.56 | 25.51 | Selective Encoding for Abstractive Sentence Summarization | - |
| Transformer+LRPE+PE+ALONE+Re-ranking | 32.57 | 11.63 | 28.24 | All Word Embeddings from One Embedding | - |
| Seq2seq + selective + MTL + ERAM | 29.33 | 10.24 | 25.24 | Ensure the Correctness of the Summary: Incorporate Entailment Knowledge into Abstractive Sentence Summarization | - |
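For readers who want to compare entries programmatically, the short sketch below re-keys a handful of rows from the table above and ranks them by ROUGE-2, placing entries with missing scores last. The score values are copied from the table; the dictionary layout is only an illustration, not a data format provided by the leaderboard.

```python
# Illustration only: a few rows copied from the leaderboard above, ranked by ROUGE-2.
# Missing values ("-" in the table) are stored as None and sorted to the end.
rows = [
    {"model": "Transformer+WDrop", "rouge1": 33.06, "rouge2": 11.45, "rougeL": 28.51},
    {"model": "Transformer+LRPE+PE+Re-ranking+Ensemble", "rouge1": 32.85, "rouge2": 11.78, "rougeL": 28.52},
    {"model": "EndDec+WFE", "rouge1": 32.28, "rouge2": 10.54, "rougeL": 27.8},
    {"model": "DRGD", "rouge1": 31.79, "rouge2": 10.75, "rougeL": 27.48},
    {"model": "ABS", "rouge1": None, "rouge2": None, "rougeL": 22.05},
]

ranked = sorted(rows, key=lambda r: (r["rouge2"] is None, -(r["rouge2"] or 0.0)))
for rank, r in enumerate(ranked, start=1):
    print(f"{rank}. {r['model']}: ROUGE-2 = {r['rouge2']}")
```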