A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents
Arman Cohan† Franck Dernoncourt* Doo Soon Kim* Trung Bui* Seokhwan Kim* Walter Chang* Nazli Goharian†
Abstract
Neural abstractive summarization models have led to promising results in summarizing relatively short documents. We propose the first model for abstractive summarization of single, longer-form documents (e.g., research papers). Our approach consists of a new hierarchical encoder that models the discourse structure of a document, and an attentive discourse-aware decoder to generate the summary. Empirical results on two large-scale datasets of scientific papers show that our model significantly outperforms state-of-the-art models.
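To make the abstract's description of the architecture concrete, the sketch below shows one way a discourse-aware attention step could be wired up: section-level attention weights rescale word-level attention within each section before normalizing over the whole document. This is a minimal illustration with hypothetical tensor names, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F

def discourse_aware_attention(dec_state, section_states, word_states, word_mask):
    """Combine section-level and word-level attention (illustrative sketch).

    dec_state:      (batch, hidden)                       current decoder state
    section_states: (batch, n_sections, hidden)           one vector per section
    word_states:    (batch, n_sections, n_words, hidden)  encoder word states
    word_mask:      (batch, n_sections, n_words)          1 = token, 0 = padding
    """
    # Section-level scores: similarity between decoder state and each section.
    sec_scores = torch.einsum("bh,bsh->bs", dec_state, section_states)
    sec_weights = F.softmax(sec_scores, dim=-1)                     # (b, s)

    # Word-level scores within each section, with padding masked out.
    word_scores = torch.einsum("bh,bswh->bsw", dec_state, word_states)
    word_scores = word_scores.masked_fill(word_mask == 0, float("-inf"))

    # Discourse-aware step: scale each word's (exponentiated) score by its
    # section's weight, then normalize over all words in the document.
    exp_scores = torch.exp(word_scores - word_scores.max())         # stability
    weighted = sec_weights.unsqueeze(-1) * exp_scores               # (b, s, w)
    flat = weighted.reshape(weighted.size(0), -1)
    attn = flat / flat.sum(dim=-1, keepdim=True)                    # (b, s*w)

    # Context vector: attention-weighted sum of all word states.
    flat_states = word_states.reshape(word_states.size(0), -1, word_states.size(-1))
    context = torch.einsum("bn,bnh->bh", attn, flat_states)
    return context, attn

# Toy usage with random tensors (shapes only; no trained encoder/decoder).
b, s, w, h = 2, 4, 10, 16
context, attn = discourse_aware_attention(
    torch.randn(b, h),
    torch.randn(b, s, h),
    torch.randn(b, s, w, h),
    torch.ones(b, s, w),
)
```

The intent of the combination is that the decoder attends mostly to words in the sections it currently finds relevant, rather than treating the document as one flat sequence.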