Sentence Compression
22 papers with code • 1 benchmark • 2 datasets
Sentence Compression is the task of reducing the length of text by removing non-essential content while preserving important facts and grammaticality.
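To make the input/output shape of the task concrete, here is a minimal, purely illustrative sketch: a rule-based compressor that deletes parentheticals and a small hand-picked set of filler adverbs. The `FILLER` set and the rules are assumptions for illustration only; the learned systems listed below induce deletions (or rewrites) from data rather than using fixed rules.

```python
import re

# Toy rule-based sentence compressor (illustration only, not a real system):
# drops parenthetical asides and a few hand-picked filler adverbs while
# preserving word order, so the output stays grammatical for simple inputs.
FILLER = {"very", "really", "quite", "actually", "basically", "just"}

def compress(sentence: str) -> str:
    # Remove parenthetical asides, e.g. "(founded in 1998)".
    s = re.sub(r"\s*\([^)]*\)", "", sentence)
    # Drop filler adverbs, stripping trailing punctuation before lookup.
    words = [w for w in s.split() if w.lower().strip(",.") not in FILLER]
    return " ".join(words)

print(compress("The company (founded in 1998) was really very successful."))
# → The company was successful.
```

Real models score each deletion against meaning preservation and grammaticality, which fixed rules like these cannot do.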
Most implemented papers
Unsupervised Abstractive Meeting Summarization with Multi-Sentence Compression and Budgeted Submodular Maximization
We introduce a novel graph-based framework for abstractive meeting speech summarization that is fully unsupervised and does not rely on any annotations.
Globally Normalized Transition-Based Neural Networks
Our model is a simple feed-forward neural network that operates on a task-specific transition system, yet achieves comparable or better accuracies than recurrent models.
Sentence Simplification with Deep Reinforcement Learning
Sentence simplification aims to make sentences easier to read and understand.
Combining Graph Degeneracy and Submodularity for Unsupervised Extractive Summarization
We present a fully unsupervised, extractive text summarization system that leverages a submodularity framework introduced by past research.
Learning How to Simplify From Explicit Labeling of Complex-Simplified Text Pairs
Current research in text simplification has been hampered by two central problems: (i) the small amount of high-quality parallel simplification data available, and (ii) the lack of explicit annotations of simplification operations, such as deletions or substitutions, on existing data.
Sequence-to-sequence Models for Cache Transition Systems
In this paper, we present a sequence-to-sequence based approach for mapping natural language sentences to AMR semantic graphs.
Unsupervised Semantic Abstractive Summarization
Automatic abstractive summary generation remains a significant open problem for natural language processing.
Unsupervised Sentence Compression using Denoising Auto-Encoders
In sentence compression, the task of shortening sentences while retaining the original meaning, models tend to be trained on large corpora containing pairs of verbose and compressed sentences.
Sentence Compression for Arbitrary Languages via Multilingual Pivoting
In this paper we advocate the use of bilingual corpora which are abundantly available for training sentence compression models.