1 code implementation • 17 May 2024 • Yixin Ji, Yang Xiang, Juntao Li, Wei Chen, Zhongyi Liu, Kehai Chen, Min Zhang
To address the challenges of low-rank compression in LLMs, we conduct empirical research on the low-rank characteristics of large models.
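As a minimal sketch of the low-rank compression idea (assuming plain truncated SVD; the paper's actual decomposition may be data-aware or layer-specific), a weight matrix can be factored into two thin matrices that approximate it at a chosen rank:

```python
import torch

def low_rank_compress(weight: torch.Tensor, rank: int):
    """Truncated-SVD low-rank approximation of a weight matrix.

    Illustrative only; the choice of `rank` and the plain SVD are
    assumptions, not the paper's exact method.
    """
    # weight = U @ diag(S) @ Vh; keep only the top-`rank` components.
    U, S, Vh = torch.linalg.svd(weight, full_matrices=False)
    A = U[:, :rank] * S[:rank]   # (out_dim, rank)
    B = Vh[:rank, :]             # (rank, in_dim)
    return A, B                  # weight ≈ A @ B

W = torch.randn(1024, 1024)
A, B = low_rank_compress(W, rank=64)
rel_err = torch.linalg.matrix_norm(W - A @ B) / torch.linalg.matrix_norm(W)
print(f"relative Frobenius error at rank 64: {rel_err.item():.3f}")
```

Storing A and B replaces out_dim × in_dim parameters with rank × (out_dim + in_dim), which is where the compression comes from whenever the chosen rank is small relative to the matrix dimensions.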
1 code implementation • 9 May 2024 • Dan Qiao, Yi Su, Pinzheng Wang, Jing Ye, Wenjing Xie, Yuechi Zhou, Yuyang Ding, Zecheng Tang, Jikai Wang, Yixin Ji, Yue Wang, Pei Guo, Zechen Sun, Zikang Zhang, Juntao Li, Pingfu Chao, Wenliang Chen, Guohong Fu, Guodong Zhou, Qiaoming Zhu, Min Zhang
Large Language Models (LLMs) play an important role in many fields thanks to their powerful capabilities. However, their massive number of parameters leads to high deployment requirements and significant inference costs, impeding their practical application.
no code implementations • 25 Apr 2023 • Yi Su, Yixin Ji, Juntao Li, Hai Ye, Min Zhang
Accordingly, in this paper, we propose perturbation consistency learning (PCL), a simple test-time adaptation method that encourages the model to make stable predictions for samples under distribution shift.
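A minimal sketch of the perturbation-consistency idea follows, assuming a Gaussian input perturbation and a KL-divergence consistency loss; the paper's exact perturbation and objective may differ:

```python
import torch
import torch.nn.functional as F

def pcl_step(model, x, optimizer, noise_std=0.01):
    """One test-time adaptation step that pushes the model toward
    consistent predictions under a small input perturbation.

    Illustrative; the noise type and scale are assumptions.
    """
    logits = model(x)
    # A perturbed view of the same unlabeled test batch.
    logits_perturbed = model(x + noise_std * torch.randn_like(x))
    # Penalize divergence between the two predictive distributions,
    # treating the clean prediction as the (detached) target.
    loss = F.kl_div(
        F.log_softmax(logits_perturbed, dim=-1),
        F.softmax(logits, dim=-1).detach(),
        reduction="batchmean",
    )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In a test-time adaptation setting, a step like this would typically update only a small subset of parameters (e.g., normalization layers) on each incoming unlabeled batch.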
no code implementations • CVPR 2020 • Jingwen Ye, Yixin Ji, Xinchao Wang, Xin Gao, Mingli Song
A dual generator is then trained, taking the output of the first generator as its input.
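A minimal sketch of that chained structure, using placeholder MLP generators and dimensions rather than the paper's actual architecture:

```python
import torch
import torch.nn as nn

# Two placeholder generators; all sizes are assumptions for illustration.
generator = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 512))
dual_generator = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 128))

z = torch.randn(8, 128)                    # latent codes for the first generator
intermediate = generator(z)                # output of the first generator ...
recovered = dual_generator(intermediate)   # ... becomes the dual generator's input
```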
1 code implementation • 28 May 2019 • Jingwen Ye, Xinchao Wang, Yixin Ji, Kairi Ou, Mingli Song
Many well-trained Convolutional Neural Network (CNN) models have been released online by developers to enable effortless reproduction.
1 code implementation • CVPR 2019 • Jingwen Ye, Yixin Ji, Xinchao Wang, Kairi Ou, Dapeng Tao, Mingli Song
In this paper, we investigate a novel deep-model reuse task.