no code implementations • 14 Nov 2023 • Philippe Laban, Lidiya Murakhovs'ka, Caiming Xiong, Chien-Sheng Wu
The interactive nature of Large Language Models (LLMs) theoretically allows models to refine and improve their answers, yet systematic analysis of the multi-turn behavior of LLMs remains limited.
1 code implementation • 26 Oct 2023 • Lidiya Murakhovs'ka, Philippe Laban, Tian Xie, Caiming Xiong, Chien-Sheng Wu
Making a big purchase requires consumers to do their own research or consult a salesperson to gain domain expertise.
1 code implementation • 7 Sep 2023 • Erik Nijkamp, Tian Xie, Hiroaki Hayashi, Bo Pang, Congying Xia, Chen Xing, Jesse Vig, Semih Yavuz, Philippe Laban, Ben Krause, Senthil Purushwalkam, Tong Niu, Wojciech Kryściński, Lidiya Murakhovs'ka, Prafulla Kumar Choubey, Alex Fabbri, Ye Liu, Rui Meng, Lifu Tu, Meghana Bhat, Chien-Sheng Wu, Silvio Savarese, Yingbo Zhou, Shafiq Joty, Caiming Xiong
Most open-source LLMs are limited in their ability to support longer sequence lengths, which is a key requirement for many tasks that require inference over a long input context.
no code implementations • 17 Feb 2023 • Philippe Laban, Chien-Sheng Wu, Lidiya Murakhovs'ka, Xiang 'Anthony' Chen, Caiming Xiong
In a second usability study, we developed and ran a reading exercise with 95 novice news readers to measure their exposure to coverage diversity.
1 code implementation • 9 Nov 2022 • Philippe Laban, Chien-Sheng Wu, Lidiya Murakhovs'ka, Xiang 'Anthony' Chen, Caiming Xiong
There are many potential benefits to news readers accessing diverse sources.
no code implementations • Findings (NAACL) 2022 • Philippe Laban, Chien-Sheng Wu, Lidiya Murakhovs'ka, Wenhao Liu, Caiming Xiong
Question generation (QGen) models are often evaluated with standardized NLG metrics that are based on n-gram overlap.
1 code implementation • Findings (NAACL) 2022 • Lidiya Murakhovs'ka, Chien-Sheng Wu, Philippe Laban, Tong Niu, Wenhao Liu, Caiming Xiong
Asking good questions is an essential ability for both human and machine intelligence.