1 code implementation • 20 Dec 2023 • Shiu-hong Kao, Jierun Chen, S. H. Gary Chan
Knowledge distillation (KD) has been recognized as an effective tool to compress and accelerate models: a compact student network is trained to match the softened output distribution of a larger teacher network.
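To make the idea concrete, below is a minimal sketch of the standard distillation loss (temperature-softened softmax plus KL divergence, following Hinton et al., 2015). This illustrates KD in general, not the specific method of the paper listed above; the function names and the temperature value are illustrative choices.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature scaling; a higher T yields a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 to keep gradient magnitudes comparable across temperatures."""
    p = softmax(teacher_logits, temperature)  # teacher's "soft targets"
    q = softmax(student_logits, temperature)  # student's predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl

# When the student exactly matches the teacher, the loss is zero;
# any mismatch yields a positive loss.
print(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # → 0.0
print(kd_loss([0.5, 1.0, 2.0], [2.0, 1.0, 0.1]) > 0)  # → True
```

In practice this distillation term is combined with the usual cross-entropy loss on the ground-truth labels, weighted by a mixing coefficient.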