Text Summarization on DUC 2004 Task 1
Metrics: ROUGE-1, ROUGE-2, ROUGE-L

Results: performance of the different models on this benchmark.
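ROUGE-1 and ROUGE-2 measure unigram and bigram overlap between a system summary and the reference, while ROUGE-L is based on the longest common subsequence. The sketch below shows how these three scores can be computed with the open-source rouge-score package; the example sentences and the stemming option are illustrative assumptions, and the official DUC 2004 protocol (recall-oriented, length-truncated scoring) is not reproduced here.

```python
# Minimal sketch: computing ROUGE-1/2/L with Google's rouge-score package
# (pip install rouge-score). Example strings are hypothetical; the official
# DUC 2004 Task 1 evaluation applies additional conventions not shown here.
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)

reference = "australia reopens embassy in manila"                # gold headline (illustrative)
candidate = "australia reopens its embassy in the philippines"   # system output (illustrative)

# score(target, prediction) returns a dict of Score tuples
# with precision, recall, and F-measure for each ROUGE variant.
scores = scorer.score(reference, candidate)
for name, score in scores.items():
    print(f"{name}: precision={score.precision:.4f} "
          f"recall={score.recall:.4f} f1={score.fmeasure:.4f}")
```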
| Model | ROUGE-1 | ROUGE-2 | ROUGE-L | Paper Title |
|---|---|---|---|---|
| Transformer LM | - | 17.74 | - | Sample Efficient Text Summarization Using a Single Pre-Trained Transformer |
| Transformer+LRPE+PE+Re-ranking+Ensemble | 32.85 | 11.78 | 28.52 | Positional Encoding to Control Output Sequence Length |
| Transformer+LRPE+PE+ALONE+Re-ranking | 32.57 | 11.63 | 28.24 | All Word Embeddings from One Embedding |
| Transformer+WDrop | 33.06 | 11.45 | 28.51 | Rethinking Perturbations in Encoder-Decoders for Fast Training |
| Reinforced-Topic-ConvS2S | 31.15 | 10.85 | 27.68 | A Reinforced Topic-Aware Convolutional Sequence-to-Sequence Model for Abstractive Text Summarization |
| DRGD | 31.79 | 10.75 | 27.48 | Deep Recurrent Generative Decoder for Abstractive Text Summarization |
| EndDec+WFE | 32.28 | 10.54 | 27.8 | Cutting-off Redundant Repeating Generations for Neural Abstractive Summarization |
| Seq2seq + selective + MTL + ERAM | 29.33 | 10.24 | 25.24 | Ensure the Correctness of the Summary: Incorporate Entailment Knowledge into Abstractive Sentence Summarization |
| SEASS | 29.21 | 9.56 | 25.51 | Selective Encoding for Abstractive Sentence Summarization |
| words-lvt5k-1sent | 28.61 | 9.42 | 25.24 | Abstractive Text Summarization Using Sequence-to-Sequence RNNs and Beyond |
| Abs+ | 28.18 | 8.49 | 23.81 | A Neural Attention Model for Abstractive Sentence Summarization |
| RAS-Elman | 28.97 | 8.26 | 24.06 | - |
| ABS | - | - | 22.05 | A Neural Attention Model for Abstractive Sentence Summarization |