no code implementations • 25 Jun 2022 • Yuanliang Meng
This paper demonstrates how to fine-tune a BART model to construct a sentence from an arbitrary set of words, a task that has historically been difficult in NLP.
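One common way to frame such a task as sequence-to-sequence fine-tuning is to pair a shuffled word set (the source) with the original sentence (the target). This is a minimal sketch of that data preparation, assuming a setup along those lines; the function name and format are illustrative, not from the paper.

```python
import random

def make_training_pair(sentence, seed=0):
    """Build a (word-set, sentence) training pair: the model would see
    the shuffled words as input and learn to reconstruct the sentence.
    (Illustrative sketch, not the paper's actual preprocessing.)"""
    words = sentence.rstrip(".").split()
    shuffled = words[:]
    random.Random(seed).shuffle(shuffled)
    return " ".join(shuffled), sentence

src, tgt = make_training_pair("The dog chased the ball.")
# src: the same words in scrambled order; tgt: the original sentence
```

Pairs like `(src, tgt)` could then be fed to any encoder-decoder model (such as BART) with a standard cross-entropy objective.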
no code implementations • 16 Oct 2019 • David Donahue, Yuanliang Meng, Anna Rumshisky
The first design features a sequence-to-sequence architecture with two separate NTM modules, one for each participant in the conversation.
no code implementations • 28 Aug 2019 • Yuanliang Meng, Anna Rumshisky
This paper proposes a Transformer-based model to generate equations for math word problems.
1 code implementation • COLING 2018 • Yuanliang Meng, Anna Rumshisky
We propose a triad-based neural network system that generates affinity scores between entity mentions for coreference resolution.
no code implementations • ACL 2018 • Yuanliang Meng, Anna Rumshisky
We propose a context-aware neural network model for temporal information extraction.
no code implementations • EMNLP 2017 • Yuanliang Meng, Anna Rumshisky, Alexey Romanov
In this paper, we propose a set of simple LSTM-based models with a uniform architecture to recover different kinds of temporal relations from text.
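The core idea of an LSTM-based relation classifier can be sketched as follows: encode a token span containing both events with an LSTM and classify the temporal relation from the final hidden state. All names, sizes, and the relation inventory below are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

class RelationLSTM(nn.Module):
    """Hypothetical sketch of a simple LSTM classifier for temporal
    relations; hyperparameters are placeholders, not the paper's."""
    def __init__(self, vocab_size=100, emb=32, hidden=64, n_relations=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.lstm = nn.LSTM(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_relations)

    def forward(self, token_ids):
        # Encode the span covering both events, then score each
        # relation label (e.g. BEFORE / AFTER / INCLUDES / SIMULTANEOUS).
        _, (h, _) = self.lstm(self.embed(token_ids))
        return self.out(h[-1])

logits = RelationLSTM()(torch.randint(0, 100, (2, 10)))  # batch of 2 spans
```

A uniform architecture like this lets the same model class be retrained for each relation extraction subtask by swapping only the label set.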