Search Results for author: William F. Shen

Found 3 papers, 0 papers with code

Worldwide Federated Training of Language Models

no code implementations · 23 May 2024 · Alex Iacob, Lorenzo Sani, Bill Marino, Preslav Aleksandrov, William F. Shen, Nicholas Donald Lane

The reliance of language model training on massive amounts of computation and vast datasets scraped from potentially low-quality, copyrighted, or sensitive data has come into question practically, legally, and ethically.

Federated Learning · Language Modelling

The Future of Large Language Model Pre-training is Federated

no code implementations · 17 May 2024 · Lorenzo Sani, Alex Iacob, Zeyu Cao, Bill Marino, Yan Gao, Tomas Paulik, Wanru Zhao, William F. Shen, Preslav Aleksandrov, Xinchi Qiu, Nicholas D. Lane

Generative pre-trained large language models (LLMs) have demonstrated impressive performance over a wide range of tasks, thanks to the unprecedented amount of data they have been trained on.

Federated Learning · Language Modelling · +1

Secure Vertical Federated Learning Under Unreliable Connectivity

no code implementations · 26 May 2023 · Xinchi Qiu, Heng Pan, Wanru Zhao, Yan Gao, Pedro P. B. Gusmao, William F. Shen, Chenyang Ma, Nicholas D. Lane

Most work in privacy-preserving federated learning (FL) has focused on horizontally partitioned datasets where clients hold the same features and train complete client-level models independently.

Privacy Preserving · Vertical Federated Learning
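The abstract above contrasts horizontal partitioning (clients hold disjoint samples over the same features) with the vertical setting the paper addresses (clients hold disjoint features over the same samples). A minimal sketch of the two data layouts, with toy data and variable names of our own choosing, not from the paper:

```python
import numpy as np

# Toy dataset: 6 samples, 4 features (values are arbitrary).
X = np.arange(24).reshape(6, 4)

# Horizontal FL: each client holds different samples (rows)
# but the same full feature set (columns), so each can train
# a complete client-level model independently.
horizontal_clients = np.array_split(X, 2, axis=0)

# Vertical FL: each client holds the same samples (rows)
# but only a subset of the features (columns), so no single
# client can train a complete model on its own.
vertical_clients = np.array_split(X, 2, axis=1)

print([c.shape for c in horizontal_clients])  # [(3, 4), (3, 4)]
print([c.shape for c in vertical_clients])    # [(6, 2), (6, 2)]
```

The shapes make the distinction concrete: horizontal splits preserve the feature dimension, while vertical splits preserve the sample dimension and divide the features.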
