The Concordia NLG Surface Realizer at SRST 2019
This paper presents the model we developed for the shallow track of the 2019 NLG Surface Realization Shared Task. The model reconstructs sentences from input in which word order and word inflections have been removed. We divided the problem into two sub-problems: reordering and inflection. For reordering, we used a pointer network that uses a transformer as its encoder and decoder modules. To generate the inflected forms of tokens, a feed-forward neural network was employed.
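The abstract describes a pointer network whose encoder and decoder are transformer modules. The following is a minimal PyTorch sketch of what such a reordering stage could look like; the additive attention scoring, module names, and dimensions are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class PointerReorderer(nn.Module):
    """Hypothetical sketch: transformer encoder-decoder whose decoder states
    attend over the unordered input tokens; the attention distribution is
    used as a pointer over input positions (i.e., which token to emit next)."""

    def __init__(self, d_model=256, nhead=8, num_layers=3):
        super().__init__()
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True)
        # Additive (Bahdanau-style) scoring for the pointer distribution.
        self.w_enc = nn.Linear(d_model, d_model, bias=False)
        self.w_dec = nn.Linear(d_model, d_model, bias=False)
        self.v = nn.Linear(d_model, 1, bias=False)

    def forward(self, src_emb, tgt_emb):
        # Encoder states, one per unordered input token.
        memory = self.transformer.encoder(src_emb)
        # Causal mask so each decoding step only sees tokens emitted so far.
        tgt_mask = nn.Transformer.generate_square_subsequent_mask(tgt_emb.size(1))
        dec = self.transformer.decoder(tgt_emb, memory, tgt_mask=tgt_mask)
        # Score every (decoder step, input token) pair; softmax over input
        # positions yields the pointer distribution for that step.
        scores = self.v(torch.tanh(
            self.w_enc(memory).unsqueeze(1) + self.w_dec(dec).unsqueeze(2)
        )).squeeze(-1)                      # (batch, tgt_len, src_len)
        return scores.log_softmax(dim=-1)   # pointer log-probabilities

# Toy usage: 5 shuffled tokens per sentence, predict a permutation of length 5.
model = PointerReorderer()
src = torch.randn(2, 5, 256)   # embeddings of the unordered input tokens
tgt = torch.randn(2, 5, 256)   # embeddings of previously emitted tokens
log_ptr = model(src, tgt)      # (2, 5, 5): log-distribution over input positions
```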
Methods
• Absolute Position Encodings
• Adam
• Additive Attention
• BPE
• Dense Connections
• Dropout
• Label Smoothing
• Layer Normalization
• Linear Layer
• LSTM
• Multi-Head Attention
• Pointer Network
• Position-Wise Feed-Forward Layer
• ReLU
• Residual Connection
• Scaled Dot-Product Attention
• Sigmoid Activation
• Softmax
• Tanh Activation
• Transformer
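The second stage described in the abstract, generating inflected forms with a feed-forward neural network, can be pictured as a classifier built from the linear-layer, ReLU, dropout, and softmax components listed above. A minimal sketch follows; the feature set, hidden size, and form vocabulary are assumptions, not the paper's actual configuration.

```python
import torch
import torch.nn as nn

class InflectionFFNN(nn.Module):
    """Hypothetical sketch: map a lemma's features (e.g., lemma embedding
    plus morphological tags) to a distribution over inflected surface forms."""

    def __init__(self, feat_dim=300, hidden=512, n_forms=10_000, p_drop=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden),
            nn.ReLU(),
            nn.Dropout(p_drop),
            nn.Linear(hidden, n_forms),  # scores over candidate surface forms
        )

    def forward(self, feats):
        return self.net(feats).log_softmax(dim=-1)

# Toy usage: features for 4 lemmas; pick the highest-scoring surface form.
model = InflectionFFNN()
feats = torch.randn(4, 300)
log_probs = model(feats)           # (4, 10000) log-probabilities over forms
predicted = log_probs.argmax(-1)   # index of the chosen inflected form
```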