1 code implementation • 24 May 2024 • Kunjal Panchal, Nisarg Parikh, Sunav Choudhary, Lijun Zhang, Yuriy Brun, Hui Guan
Empirically, Spry reduces the memory footprint during training by 1.4-7.1$\times$ in contrast to backpropagation, while reaching comparable accuracy, across a wide range of language tasks, models, and FL settings.
no code implementations • 28 Nov 2022 • Kunjal Panchal, Sunav Choudhary, Nisarg Parikh, Lijun Zhang, Hui Guan
Current approaches to personalization in FL are at a coarse granularity, i.e., all the input instances of a client use the same personalized model.