1 code implementation • Control Engineering Practice 2024 • Xiaohan Chen, Rui Yang, Yihao Xue, Baoye Song, Zidong Wang
Recent advances in intelligent rotating machinery fault diagnosis have been enabled by the availability of massive labeled training data.
no code implementations • 18 Mar 2024 • Yihao Xue, Eric Gan, Jiayi Ni, Siddharth Joshi, Baharan Mirzasoleiman
An effective technique for obtaining high-quality representations is adding a projection head on top of the encoder during training, then discarding it and using the pre-projection representations.
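The projection-head pattern described above can be sketched as follows. This is a minimal toy illustration (linear encoder, hypothetical shapes, not the paper's actual architecture): the contrastive loss sees only the projected, normalized output during training, while downstream tasks use the pre-projection encoder representation after the head is discarded.

```python
import numpy as np

rng = np.random.default_rng(0)
W_enc = rng.normal(size=(10, 8))   # toy linear encoder weights
W_proj = rng.normal(size=(8, 4))   # projection-head weights (discarded after training)

def encode(x):
    """Pre-projection representation: kept and used for downstream tasks."""
    return np.tanh(x @ W_enc)

def project(h):
    """Projection-head output: used only by the contrastive training loss."""
    z = h @ W_proj
    return z / np.linalg.norm(z, axis=-1, keepdims=True)  # unit-norm embeddings

x = rng.normal(size=(2, 10))
h = encode(x)    # shape (2, 8): the representation retained after training
z = project(h)   # shape (2, 4): fed to the loss, then thrown away
```

At evaluation time only `encode` is called; `project` exists solely so the loss is computed in a space decoupled from the representation the encoder must retain.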
no code implementations • 8 Oct 2023 • Yihao Xue, Siddharth Joshi, Dang Nguyen, Baharan Mirzasoleiman
Recently, multimodal contrastive learning (MMCL) approaches, such as CLIP, have achieved remarkable success in learning representations that are robust to distribution shift and generalize to new domains.
1 code implementation • 21 Jun 2023 • Siddharth Joshi, Yu Yang, Yihao Xue, Wenhan Yang, Baharan Mirzasoleiman
Deep neural networks often exploit non-predictive features that are spuriously correlated with class labels, leading to poor performance on groups of examples without such features.
no code implementations • 25 May 2023 • Yihao Xue, Siddharth Joshi, Eric Gan, Pin-Yu Chen, Baharan Mirzasoleiman
However, supervised CL is prone to collapsing representations of subclasses within a class by not capturing all their features, while unsupervised CL may suppress harder class-relevant features by focusing on easy class-irrelevant ones; both significantly compromise representation quality.
no code implementations • 23 May 2023 • Yihao Xue, Ali Payani, Yu Yang, Baharan Mirzasoleiman
Pretrained machine learning models need to be adapted to distribution shifts when deployed in new target environments.
no code implementations • 17 Aug 2022 • Yihao Xue, Kyle Whitecross, Baharan Mirzasoleiman
However, the effect of label noise on the test loss curve has not been fully explored.
no code implementations • 29 Jan 2022 • Yihao Xue, Kyle Whitecross, Baharan Mirzasoleiman
Self-supervised Contrastive Learning (CL) has recently been shown to be very effective in preventing deep networks from overfitting noisy labels.
no code implementations • 20 Dec 2020 • Yihao Xue, Chaoyue Niu, Zhenzhe Zheng, Shaojie Tang, Chengfei Lv, Fan Wu, Guihai Chen
Federated learning allows mobile clients to jointly train a global model without sending their private data to a central server.
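The federated setup described above can be sketched with a federated-averaging round (a toy FedAvg-style example on synthetic least-squares data; the client loss, learning rate, and round counts are illustrative assumptions, not the paper's setup). Each client trains a copy of the global model on its private data and returns only the updated weights; the server averages them weighted by local dataset size, so raw data never leaves the clients.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, steps=10):
    """One client's local training: gradient steps on its private data only."""
    w = weights.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)  # least-squares gradient
        w -= lr * grad
    return w

def fedavg_round(global_w, client_data):
    """Server aggregates client models, weighted by local sample counts."""
    updates, sizes = [], []
    for X, y in client_data:           # only weights, never (X, y), are shared
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# Synthetic clients that all observe the same underlying linear model.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 3))
    clients.append((X, X @ w_true))

w = np.zeros(3)
for _ in range(50):                    # communication rounds
    w = fedavg_round(w, clients)       # w converges toward w_true
```

The key property illustrated: the server's view is limited to model parameters, which is what makes the scheme compatible with keeping mobile clients' data local.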