13 Oct 2022 • Aashish Arora, Harshitha Malireddi, Daniel Bauer, Asad Sayeed, Yuval Marton
Unlike previous work, our model requires no pre-training or fine-tuning on additional tasks beyond off-the-shelf (static or contextual) embeddings and supervision.