Text Summarization on DUC 2004 Task 1
Evaluation Metrics
ROUGE-1
ROUGE-2
ROUGE-L
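Systems on this leaderboard are compared by the three ROUGE variants listed above: ROUGE-1 and ROUGE-2 measure unigram and bigram overlap with the reference summaries, and ROUGE-L measures the longest common subsequence. The snippet below is a minimal sketch of how such scores can be computed with the open-source `rouge-score` package; the example sentences are hypothetical, and the exact DUC-2004 evaluation protocol (multiple references, output truncation) is not reproduced here.

```python
# Minimal sketch: computing ROUGE-1/2/L for one candidate summary with the
# open-source `rouge-score` package (pip install rouge-score).
# The reference and candidate strings are hypothetical, not DUC-2004 data.
from rouge_score import rouge_scorer

reference = "us president visits troops in iraq"
candidate = "president makes surprise visit to troops in iraq"

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)
scores = scorer.score(reference, candidate)

for name, score in scores.items():
    # Each entry holds precision, recall, and F-measure; a leaderboard may
    # report either recall or F-measure depending on the benchmark protocol.
    print(f"{name}: P={score.precision:.4f} R={score.recall:.4f} F={score.fmeasure:.4f}")
```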
Evaluation Results
Performance of each model on this benchmark
| Model Name | ROUGE-1 | ROUGE-2 | ROUGE-L | Paper Title | Repository |
|---|---|---|---|---|---|
| DRGD | 31.79 | 10.75 | 27.48 | Deep Recurrent Generative Decoder for Abstractive Text Summarization | - |
| RAS-Elman | 28.97 | 8.26 | 24.06 | - | - |
| ABS | - | - | 22.05 | A Neural Attention Model for Abstractive Sentence Summarization | - |
| Transformer+LRPE+PE+Re-ranking+Ensemble | 32.85 | 11.78 | 28.52 | Positional Encoding to Control Output Sequence Length | - |
| Abs+ | 28.18 | 8.49 | 23.81 | A Neural Attention Model for Abstractive Sentence Summarization | - |
| words-lvt5k-1sent | 28.61 | 9.42 | 25.24 | Abstractive Text Summarization Using Sequence-to-Sequence RNNs and Beyond | - |
| Transformer+WDrop | 33.06 | 11.45 | 28.51 | Rethinking Perturbations in Encoder-Decoders for Fast Training | - |
| EndDec+WFE | 32.28 | 10.54 | 27.8 | Cutting-off Redundant Repeating Generations for Neural Abstractive Summarization | - |
| Transformer LM | - | 17.74 | - | Sample Efficient Text Summarization Using a Single Pre-Trained Transformer | - |
| Reinforced-Topic-ConvS2S | 31.15 | 10.85 | 27.68 | A Reinforced Topic-Aware Convolutional Sequence-to-Sequence Model for Abstractive Text Summarization | - |
| SEASS | 29.21 | 9.56 | 25.51 | Selective Encoding for Abstractive Sentence Summarization | - |
| Transformer+LRPE+PE+ALONE+Re-ranking | 32.57 | 11.63 | 28.24 | All Word Embeddings from One Embedding | - |
| Seq2seq + selective + MTL + ERAM | 29.33 | 10.24 | 25.24 | Ensure the Correctness of the Summary: Incorporate Entailment Knowledge into Abstractive Sentence Summarization | - |