1 code implementation • 11 Apr 2023 • Hadeel Al-Negheimish, Pranava Madhyastha, Alessandra Russo
Large pre-trained language models such as BERT have been widely used as a framework for natural language understanding (NLU) tasks.
no code implementations • EMNLP 2021 • Hadeel Al-Negheimish, Pranava Madhyastha, Alessandra Russo
The current standings of these models on the DROP leaderboard, measured with standard metrics, suggest that they have achieved near-human performance.
no code implementations • EACL 2021 • Hadeel Al-Negheimish, Pranava Madhyastha, Alessandra Russo
Reasoning about information from multiple parts of a passage to derive an answer is an open challenge for reading-comprehension models.