no code implementations • NAACL 2022 • Rahul Sharma, Anil Ramakrishna, Ansel MacLaughlin, Anna Rumshisky, Jimit Majmudar, Clement Chung, Salman Avestimehr, Rahul Gupta
Federated learning (FL) has recently emerged as a method for training ML models on edge devices using sensitive user data and is seen as a way to mitigate concerns over data privacy.
1 code implementation • Joint Conference on Lexical and Computational Semantics 2021 • Ansel MacLaughlin, Shaobin Xu, David A. Smith
In extensive experiments, we study the relative performance of four classes of neural and bag-of-words models on three LTRD tasks: detecting plagiarism, modeling journalists' use of press releases, and identifying scientists' citation of earlier papers.
no code implementations • EACL 2021 • Ansel MacLaughlin, David Smith
We explore the task of quotability identification, in which, given a document, we aim to identify which of its passages are the most quotable, i.e., the most likely to be directly quoted by later derived documents.
no code implementations • EMNLP (insights) 2020 • Ansel MacLaughlin, Jwala Dhamala, Anoop Kumar, Sriram Venkatapathy, Ragav Venkatesan, Rahul Gupta
Neural Architecture Search (NAS) methods, which automatically learn entire neural model or individual neural cell architectures, have recently achieved competitive or state-of-the-art (SOTA) performance on a variety of natural language processing and computer vision tasks, including language modeling, natural language inference, and image classification.
no code implementations • 17 May 2020 • Ansel MacLaughlin, Tao Chen, Burcu Karagol Ayan, Dan Roth
Our experiments confirm the strong performance of BERT-based methods on this task, which outperform bag-of-words and neural ranking baselines by more than 30% relative across all ranking metrics.