no code implementations • 8 Jan 2024 • Yuhan Tang, Zhiyuan Wu, Bo Gao, Tian Wen, Yuwei Wang, Sheng Sun
Federated Distillation (FD) is a novel and promising distributed machine learning paradigm, in which knowledge distillation is leveraged to facilitate more efficient and flexible cross-device knowledge transfer in federated learning.
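The abstract does not spell out the mechanics, but a common FD setup has clients exchange soft predictions (logits) on a shared public dataset rather than model weights; the server averages them into an ensemble "teacher" signal that each client distills from locally. The sketch below illustrates that general pattern; all names and the random stand-ins for model outputs are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_CLIENTS = 3
PUBLIC_SAMPLES = 5   # size of the shared reference dataset (assumed)
NUM_CLASSES = 4

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Each client's local model produces logits on the public dataset
# (stubbed here with random values standing in for real model outputs).
client_logits = [rng.normal(size=(PUBLIC_SAMPLES, NUM_CLASSES))
                 for _ in range(NUM_CLIENTS)]

# Server step: average the clients' soft predictions to form the
# global distillation target (the "ensemble teacher").
client_probs = [softmax(logits) for logits in client_logits]
teacher_probs = np.mean(client_probs, axis=0)

# Client step: each client minimizes a distillation loss against the
# teacher targets during local training, e.g. KL(teacher || student).
def kl_divergence(p, q, eps=1e-12):
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

loss_client0 = kl_divergence(teacher_probs, client_probs[0]).mean()
```

Communicating per-sample class probabilities instead of full model weights is what makes this style of transfer cheaper and model-agnostic: clients may even run heterogeneous architectures, since only output distributions are exchanged.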
1 code implementation • 7 Dec 2023 • Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Tian Wen, Wen Wang
ALU drastically decreases the communication frequency in federated distillation, thereby significantly reducing communication overhead during training.
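The snippet does not describe how ALU works internally, so the following is only a generic illustration of reducing communication frequency, not the paper's algorithm: a client accumulates its soft predictions locally over several rounds and uploads an averaged payload only every few rounds. The interval, shapes, and accumulation rule here are all assumptions for illustration.

```python
import numpy as np

UPLOAD_INTERVAL = 4   # communicate once every 4 local rounds (assumed value)
TOTAL_ROUNDS = 12
NUM_SAMPLES, NUM_CLASSES = 5, 3

rng = np.random.default_rng(1)
accumulator = np.zeros((NUM_SAMPLES, NUM_CLASSES))
uploads = 0

for rnd in range(1, TOTAL_ROUNDS + 1):
    # Stand-in for this round's local soft predictions on shared samples.
    round_probs = rng.dirichlet(np.ones(NUM_CLASSES), size=NUM_SAMPLES)
    accumulator += round_probs              # accumulate knowledge locally
    if rnd % UPLOAD_INTERVAL == 0:          # communicate only periodically
        payload = accumulator / UPLOAD_INTERVAL  # averaged soft labels
        uploads += 1                        # (in practice: send to server)
        accumulator[:] = 0.0                # reset for the next window

# With an interval of 4 over 12 rounds, 3 uploads occur instead of 12,
# a 4x reduction in communication rounds under these assumed settings.
```

The trade-off in such schemes is staleness: the server sees each client's knowledge less often, so the interval controls a balance between communication cost and freshness of the distilled signal.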