no code implementations • 23 May 2024 • Andrew Parry, Sean MacAvaney, Debasis Ganguly
Large Language Models (LLMs) have significantly impacted many facets of natural language processing and information retrieval.
1 code implementation • 2 May 2024 • Andrew Parry, Thomas Jaenich, Sean MacAvaney, Iadh Ounis
In re-ranking, we investigate operating points of adaptive re-ranking with different first-stage retrievers to find the point in graph traversal at which the first stage no longer affects the performance of the overall retrieval pipeline.
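The adaptive re-ranking setup referred to above can be illustrated with a minimal sketch: an initial candidate pool is re-scored, and graph neighbours of well-scoring documents are pulled into the pool until a scoring budget is exhausted. All names here (`score`, `adaptive_rerank`, the term-overlap scorer) are illustrative stand-ins, not the paper's actual pipeline.

```python
def score(query, doc):
    # Stand-in relevance scorer: term overlap between query and document.
    # A real pipeline would use a neural re-ranker here.
    return len(set(query.lower().split()) & set(doc.lower().split()))

def adaptive_rerank(query, pool, graph, corpus, budget):
    """Re-rank an initial pool, adaptively adding graph neighbours of
    documents that score well, until the scoring budget is spent."""
    frontier = list(pool)
    scored, seen = {}, set(pool)
    while frontier and len(scored) < budget:
        doc_id = frontier.pop(0)
        scored[doc_id] = score(query, corpus[doc_id])
        if scored[doc_id] > 0:
            # Expand the frontier via the corpus proximity graph.
            for nb in graph.get(doc_id, []):
                if nb not in seen:
                    seen.add(nb)
                    frontier.append(nb)
    return sorted(scored, key=scored.get, reverse=True)
```

The "operating point" question above corresponds to varying which first-stage retriever supplies `pool` and how large `budget` is: past some depth of traversal, the graph expansion dominates and the initial pool composition stops mattering.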
no code implementations • 2 May 2024 • Andrew Parry, Debasis Ganguly, Manish Chandra
With the increasing capability of large language models (LLMs), in-context learning (ICL) has emerged as a new paradigm for natural language processing (NLP): instead of fine-tuning an LLM's parameters on labeled examples for a specific downstream task, a small number of such examples is appended to the prompt instruction to steer the decoder's generation process.
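The ICL setup described above amounts to simple prompt construction rather than parameter updates. A minimal sketch (the instruction text and `Input:`/`Label:` format are illustrative assumptions, not the paper's template):

```python
def build_icl_prompt(instruction, examples, test_input):
    """Assemble an in-context learning prompt: a task instruction,
    a few labeled examples, and the unlabeled test input.
    The decoder's continuation after the final 'Label:' is the prediction."""
    parts = [instruction]
    for text, label in examples:
        parts.append(f"Input: {text}\nLabel: {label}")
    parts.append(f"Input: {test_input}\nLabel:")
    return "\n\n".join(parts)
```

No gradient step touches the model; the labeled examples influence the output only through the conditioning context.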
1 code implementation • 1 May 2024 • Andrew Parry, Sean MacAvaney, Debasis Ganguly
We demonstrate such defects by showing that non-relevant text, such as promotional content, can easily be injected into a document without adversely affecting its position in search results.
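The injection defect above can be illustrated with a toy lexical scorer: appending query-irrelevant promotional text adds no matching terms, so the document's score, and hence its rank, is unchanged. (The paper's actual subject is neural relevance models; this bag-of-words stand-in only illustrates why score-preserving injection is possible.)

```python
def bow_score(query, doc):
    # Toy bag-of-words scorer: count occurrences of query terms.
    terms = doc.lower().split()
    return sum(terms.count(t) for t in query.lower().split())

query = "neural ranking"
doc = "neural ranking models for information retrieval"
# Injected promotional content shares no terms with the query.
injected = doc + " buy our amazing product today"
```

Under this scorer, `doc` and `injected` receive identical scores for the query, so the injected document keeps its position in any ranking over the collection.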
1 code implementation • 12 Mar 2024 • Andrew Parry, Maik Fröbe, Sean MacAvaney, Martin Potthast, Matthias Hagen
Modern sequence-to-sequence relevance models like monoT5 can effectively capture complex textual interactions between queries and documents through cross-encoding.
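Cross-encoding, as used by monoT5, means the query and document are placed in a single input sequence so the model can attend across both. A minimal sketch of the monoT5 input template (the model then scores the pair by the probability it generates "true" rather than "false"; the actual model call is omitted here):

```python
def monot5_input(query, document):
    """Build the monoT5-style sequence-to-sequence input. The model is
    fine-tuned to continue this prompt with 'true' or 'false', and the
    probability of 'true' serves as the relevance score."""
    return f"Query: {query} Document: {document} Relevant:"
```

Because query and document tokens interact in every attention layer, this captures richer interactions than bi-encoder models that embed each side independently, at the cost of scoring each pair separately at query time.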