1 code implementation • 26 Jul 2022 • Xingqun Qi, Zhuojie Wu, Min Ren, Muyi Sun, Caifeng Shan, Zhenan Sun
Building on the domain-invariant representative vectors in MSAN, we propose two generalizable knowledge distillation schemes for cross-domain distillation: Dual Contrastive Graph Distillation (DCGD) and Domain-Invariant Cross Distillation (DICD).
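The abstract does not give the exact formulation of the contrastive distillation objective, but its core idea can be illustrated with a generic InfoNCE-style loss that pulls each student feature toward its matching teacher feature and away from the others. This is a minimal sketch, not the paper's method; the function name, the temperature parameter, and the plain-list feature representation are all assumptions for illustration.

```python
import math

def cosine(u, v):
    # Cosine similarity between two feature vectors (plain Python lists).
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_distillation_loss(student_feats, teacher_feats, temperature=0.5):
    """Illustrative InfoNCE-style distillation loss (an assumption, not the
    paper's DCGD): each student feature should be most similar to its own
    teacher feature, treating all other teacher features as negatives."""
    n = len(student_feats)
    loss = 0.0
    for i in range(n):
        sims = [cosine(student_feats[i], teacher_feats[j]) / temperature
                for j in range(n)]
        # Numerically stable log-sum-exp over the similarity row.
        m = max(sims)
        log_denom = m + math.log(sum(math.exp(s - m) for s in sims))
        loss += -(sims[i] - log_denom)
    return loss / n
```

With aligned student/teacher features the loss is lower than with mismatched ones, which is the behavior a contrastive distillation term relies on.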
no code implementations • 6 Apr 2022 • Zhuojie Wu, Xingqun Qi, Zijian Wang, Wanting Zhou, Kun Yuan, Muyi Sun, Zhenan Sun
Furthermore, to strengthen the inter-coordination between corrupted and non-corrupted regions and enhance the intra-coordination within corrupted regions, we design InCo2 Loss, a pair of similarity-based losses that constrain feature consistency.
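The excerpt says InCo2 Loss is similarity-based but does not specify its formula. A generic cosine-similarity consistency term, sketched below under that assumption, shows the shape of such a loss: paired feature vectors from two regions are penalized in proportion to how dissimilar they are. The function names and pairing scheme are illustrative, not the paper's definition.

```python
import math

def cosine_sim(u, v):
    # Cosine similarity between two feature vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def consistency_loss(region_a_feats, region_b_feats):
    """Illustrative similarity-based consistency loss (an assumption, not
    the paper's InCo2 Loss): average (1 - cosine similarity) over paired
    feature vectors, so identical features give zero loss and opposed
    features give the maximum of 2."""
    pairs = list(zip(region_a_feats, region_b_feats))
    return sum(1 - cosine_sim(a, b) for a, b in pairs) / len(pairs)
```

Applied between corrupted-region and non-corrupted-region features it would encourage inter-coordination; applied among corrupted-region features it would encourage intra-coordination.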
no code implementations • 26 Aug 2021 • Zhuojie Wu, Zijian Wang, Wenxuan Zou, Fan Ji, Hao Dang, Wanting Zhou, Muyi Sun
In the three-dimensional feature learning path, we design a novel Adaptive Pooling Module (APM) and propose a new Quadruple Attention Module (QAM).
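The excerpt names an Adaptive Pooling Module without describing its internals. As background, adaptive pooling in the usual sense (e.g. `torch.nn.AdaptiveAvgPool1d`) maps a variable-length input to a fixed output size by averaging over nearly equal bins; the sketch below is that standard operation in plain Python, not the paper's APM.

```python
def adaptive_avg_pool1d(xs, out_size):
    """Standard adaptive average pooling over a 1-D sequence: partition
    the input into out_size nearly equal bins (floor/ceil boundaries, as
    in torch.nn.AdaptiveAvgPool1d) and average each bin, so any input
    length maps to a fixed output length."""
    n = len(xs)
    out = []
    for i in range(out_size):
        start = (i * n) // out_size               # floor(i * n / out_size)
        end = -((-(i + 1) * n) // out_size)       # ceil((i + 1) * n / out_size)
        out.append(sum(xs[start:end]) / (end - start))
    return out
```

For example, pooling a length-4 sequence to size 2 averages each half; pooling a sequence to its own length leaves it unchanged.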