no code implementations • 12 Dec 2022 • Miguel Rios, Ameen Abu-Hanna
We demonstrate that the sparse mechanism outperforms the dense one for local self-attention in terms of predictive performance on a publicly available dataset, and assigns higher attention to prespecified relevant directive words.
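The contrast between dense and sparse local self-attention can be illustrated with attention masks. The sketch below is illustrative only and not the paper's implementation: it assumes "local" means a banded window of a given radius, and compares a full (dense) mask against a windowed (sparse) one.

```python
import numpy as np

def local_attention_mask(seq_len, window):
    """Boolean mask: each position may attend only within +/- `window` positions."""
    idx = np.arange(seq_len)
    return np.abs(idx[:, None] - idx[None, :]) <= window

def masked_softmax(scores, mask):
    """Softmax over allowed positions; disallowed positions get zero weight."""
    scores = np.where(mask, scores, -np.inf)
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Toy attention scores for a 6-token sequence (random stand-ins).
rng = np.random.default_rng(0)
scores = rng.normal(size=(6, 6))

dense = masked_softmax(scores, np.ones((6, 6), dtype=bool))   # full attention
sparse = masked_softmax(scores, local_attention_mask(6, 1))   # window radius 1
```

With the sparse mask, attention mass is concentrated on a few neighbouring tokens instead of being spread over the whole sequence, which is the mechanism the comparison above evaluates.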
1 code implementation • 5 Dec 2022 • Miguel Rios, Raluca-Maria Chereji, Alina Secara, Dragos Ciobanu
Multilingual Neural Machine Translation (MNMT) models leverage many language pairs during training to improve translation quality for low-resource languages by transferring knowledge from high-resource languages.
1 code implementation • 1 Dec 2022 • Miguel Rios, Ameen Abu-Hanna
This often leads to a drop in the performance of neural models for prospective patients, especially in terms of their calibration.
1 code implementation • ACL 2019 • Iacer Calixto, Miguel Rios, Wilker Aziz
In this work, we propose to model the interaction between visual and textual features for multi-modal neural machine translation (MMT) through a latent variable model.
Ranked #9 on Multimodal Machine Translation on Multi30K
1 code implementation • NAACL 2018 • Miguel Rios, Wilker Aziz, Khalil Sima'an
This work exploits translation data as a source of semantically relevant learning signal for models of word representation.
no code implementations • WS 2017 • Jan-Thorsten Peter, Hermann Ney, Ondřej Bojar, Ngoc-Quan Pham, Jan Niehues, Alex Waibel, Franck Burlot, François Yvon, Mārcis Pinnis, Valters Šics, Jasmijn Bastings, Miguel Rios, Wilker Aziz, Philip Williams, Frédéric Blain, Lucia Specia