Search Results for author: Tian Wen

Found 2 papers, 1 papers with code

Logits Poisoning Attack in Federated Distillation

no code implementations · 8 Jan 2024 · Yuhan Tang, Zhiyuan Wu, Bo Gao, Tian Wen, Yuwei Wang, Sheng Sun

Federated Distillation (FD) is a novel and promising distributed machine learning paradigm, where knowledge distillation is leveraged to facilitate a more efficient and flexible cross-device knowledge transfer in federated learning.
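The core idea described above, clients exchanging knowledge (e.g., per-class logits) rather than full model weights, can be sketched minimally as follows. This is an illustrative toy, not the paper's implementation; the function name and the simple element-wise averaging are assumptions.

```python
# Hedged sketch of knowledge transfer in Federated Distillation:
# each client uploads a logit vector (its "knowledge") and the server
# aggregates them, instead of averaging full model weights.

def aggregate_logits(client_logits):
    """Server-side step (illustrative): element-wise mean of client logits."""
    n = len(client_logits)
    dim = len(client_logits[0])
    return [sum(c[i] for c in client_logits) / n for i in range(dim)]

# Two hypothetical clients report logits for a 3-class task.
client_logits = [
    [2.0, 0.5, -1.0],  # client A
    [1.0, 1.5, -2.0],  # client B
]
global_logits = aggregate_logits(client_logits)
print(global_logits)  # [1.5, 1.0, -1.5]
```

Because only logit vectors (size = number of classes) travel over the network, the per-round payload is far smaller than transmitting model parameters, which is the efficiency argument FD rests on.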

Federated Learning · Knowledge Distillation +1

Improving Communication Efficiency of Federated Distillation via Accumulating Local Updates

1 code implementation · 7 Dec 2023 · Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Tian Wen, Wen Wang

ALU (Accumulating Local Updates) drastically decreases the frequency of communication in federated distillation, thereby significantly reducing the communication overhead during the training process.
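The accumulation idea can be sketched as follows: a client sums its per-round local updates and uploads only every `interval` rounds, cutting communication frequency by that factor. This is a minimal illustration under assumed names (`run_client`, `interval`), not the paper's actual algorithm.

```python
# Hedged sketch of accumulating local updates (ALU-style):
# instead of uploading after every local round, the client accumulates
# updates and communicates once per `interval` rounds.

def run_client(local_updates, interval):
    """Return the list of uploads; each upload is an accumulated sum."""
    uploads = []
    acc = 0
    for round_idx, update in enumerate(local_updates, start=1):
        acc += update          # accumulate locally, no communication yet
        if round_idx % interval == 0:
            uploads.append(acc)  # communicate the accumulated update
            acc = 0
    return uploads

# 4 local rounds, upload every 2nd round -> 2 uploads instead of 4.
uploads = run_client([1, 2, 3, 4], interval=2)
print(uploads)  # [3, 7]
```

With `interval = k`, the number of communication rounds drops by roughly a factor of k, which is the mechanism behind the reduced communication overhead claimed in the abstract.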

Federated Learning
