Backdoor Defense for Data-Free Distillation with Poisoned Teachers
1 paper with code • 0 benchmarks • 0 datasets
Defend against backdoor attacks from poisoned teachers.
Benchmarks
These leaderboards are used to track progress in Backdoor Defense for Data-Free Distillation with Poisoned Teachers
No evaluation results yet. Help compare methods by submitting evaluation metrics.
Most implemented papers
Revisiting Data-Free Knowledge Distillation with Poisoned Teachers
Data-free knowledge distillation (KD) transfers knowledge from a pre-trained model (the teacher) to a smaller model (the student) without access to the original data used to train the teacher.
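To make the distillation objective concrete, below is a minimal sketch of the standard temperature-softened KD loss (KL divergence between teacher and student outputs, scaled by T²). This is an illustration of generic knowledge distillation, not the defense method from the paper above; in the data-free setting the inputs would come from a synthetic-data generator rather than a real dataset, which is omitted here. All function names are hypothetical.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over the last axis, with the usual
    # max-subtraction trick for numerical stability.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, T=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # multiplied by T**2 so gradients keep a comparable scale across T.
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return float(T * T * kl.mean())

# Identical logits give (near-)zero loss; diverging logits increase it.
same = distillation_loss([[2.0, 0.5, -1.0]], [[2.0, 0.5, -1.0]])
diff = distillation_loss([[2.0, 0.5, -1.0]], [[-1.0, 0.5, 2.0]])
```

A poisoned teacher embeds its backdoor in exactly these soft targets, which is why a student distilled from it can inherit the backdoor even without seeing any poisoned training samples.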