Search Results for author: S. H. Gary Chan

Found 1 paper, 1 paper with code

StableKD: Breaking Inter-block Optimization Entanglement for Stable Knowledge Distillation

1 code implementation • 20 Dec 2023 • Shiu-hong Kao, Jierun Chen, S. H. Gary Chan

Knowledge distillation (KD) has been recognized as an effective tool to compress and accelerate models.

Knowledge Distillation
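For context on the task tag above: knowledge distillation trains a compact student model to match a larger teacher's temperature-softened output distribution. The sketch below is a minimal illustration of the classic soft-target loss (Hinton et al.), not StableKD's block-wise scheme; the function names and the temperature value are illustrative assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=4.0):
    """KL divergence between teacher and student soft targets,
    scaled by T^2 so gradient magnitude stays comparable across temperatures."""
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# A student that matches the teacher exactly incurs zero distillation loss;
# any mismatch yields a positive penalty.
print(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # → 0.0
print(kd_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0]) > 0)  # → True
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, weighted by a mixing coefficient.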
