no code implementations • 8 May 2024 • Xiaomin Zhuang, Yufan Jiang, Qiaozhi He, Zhihua Wu
In this report, we present ChuXin, an entirely open-source language model with a size of 1.6 billion parameters.
no code implementations • 28 Mar 2024 • Yufan Jiang, Qiaozhi He, Xiaomin Zhuang, Zhihua Wu
We present Code Comparison Tuning (CCT), a simple and effective tuning method for code large language models (Code LLMs) to better handle subtle code errors.
no code implementations • 7 Aug 2023 • Yufan Jiang, Qiaozhi He, Xiaomin Zhuang, Zhihua Wu, Kunpeng Wang, Wenlai Zhao, Guangwen Yang
Existing large language models have to run K times to generate a sequence of K tokens.
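The claim above can be illustrated with a minimal sketch of standard autoregressive decoding: each generated token requires one sequential model call, so K tokens cost K forward passes. The `next_token` function below is a hypothetical stand-in for a full LLM forward pass, used only to make the call count concrete.

```python
# Toy stand-in for an LLM forward pass (hypothetical, for illustration):
# it "predicts" the current context length as the next token.
def next_token(context):
    return len(context)

def generate(prompt, k):
    """Standard autoregressive decoding: one model call per new token."""
    tokens = list(prompt)
    calls = 0
    for _ in range(k):                 # K iterations -> K forward passes
        tokens.append(next_token(tokens))
        calls += 1
    return tokens, calls

out, calls = generate([1, 2, 3], 4)
print(calls)   # -> 4: generating K=4 tokens took 4 sequential model calls
```

Methods that decode several tokens per step aim to break exactly this one-call-per-token coupling.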