2 code implementations • 27 Nov 2023 • Shaohua Wu, Xudong Zhao, Shenling Wang, Jiangang Luo, Lingjun Li, Xi Chen, Bing Zhao, Wei Wang, Tong Yu, Rongguo Zhang, Jiahua Zhang, Chao Wang
In this work, we develop and release Yuan 2.0, a series of large language models with parameters ranging from 2.1 billion to 102.6 billion.
no code implementations • 10 May 2022 • Bing Zhao, Jun Li, Hong Zhu
To bridge the performance gap, we propose a novel object-level self-supervised learning method, called Contrastive learning with Downstream background invariance (CoDo).
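The background-invariance idea above can be illustrated with a standard contrastive objective. This is a hedged toy sketch, not the authors' CoDo implementation: two embeddings of the same object (here, the same vector plus small "background" noise) are treated as a positive pair under an InfoNCE loss, while other objects in the batch act as negatives.

```python
import numpy as np

def info_nce(z1, z2, tau=0.1):
    """InfoNCE loss for paired embeddings: row i of z1 matches row i of z2.

    Illustrative only -- a generic contrastive loss, not the CoDo codebase.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)   # unit-normalize views
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                              # pairwise cosine similarities
    logits -= logits.max(axis=1, keepdims=True)           # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))             # matched pairs sit on the diagonal
```

With this loss, embeddings that stay stable when the background changes score lower than randomly paired ones, which is the invariance property the snippet describes.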
1 code implementation • NeurIPS 2021 • Qi Chen, Bing Zhao, Haidong Wang, Mingqin Li, Chuanjie Liu, Zengzhong Li, Mao Yang, Jingdong Wang
It stores the centroid points of the posting lists in the memory and the large posting lists in the disk.
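The memory/disk split described above can be sketched as follows. This is a hedged toy illustration of the idea, not the paper's system: centroids stay in RAM, each posting list is serialized to a file on disk, and a query loads only the posting lists of its nearest centroids.

```python
import numpy as np
import pickle
import tempfile

class DiskBackedIndex:
    """Toy centroid-in-memory, posting-lists-on-disk index (illustrative only)."""

    def __init__(self, vectors, n_lists=4, seed=0):
        rng = np.random.default_rng(seed)
        # Pick random vectors as centroids; these stay in memory.
        self.centroids = vectors[rng.choice(len(vectors), n_lists, replace=False)]
        assign = np.argmin(((vectors[:, None] - self.centroids[None]) ** 2).sum(-1), axis=1)
        # Serialize each posting list to disk, remembering its byte offset and size.
        self._offsets = {}
        with tempfile.NamedTemporaryFile(delete=False) as f:
            self._path = f.name
            for c in range(n_lists):
                ids = np.nonzero(assign == c)[0]
                blob = pickle.dumps((ids, vectors[ids]))
                self._offsets[c] = (f.tell(), len(blob))
                f.write(blob)

    def search(self, query, n_probe=2):
        # Rank centroids in memory, then read only the probed posting lists from disk.
        order = np.argsort(((self.centroids - query) ** 2).sum(-1))[:n_probe]
        best_id, best_d = -1, np.inf
        with open(self._path, "rb") as f:
            for c in order:
                off, size = self._offsets[int(c)]
                f.seek(off)
                ids, vecs = pickle.loads(f.read(size))
                if len(ids) == 0:
                    continue
                d = ((vecs - query) ** 2).sum(-1)
                i = int(np.argmin(d))
                if d[i] < best_d:
                    best_id, best_d = int(ids[i]), float(d[i])
        return best_id
```

The point of the split is that only a few posting lists are read per query, so the bulk of the index never needs to fit in memory.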
no code implementations • 4 Dec 2020 • Dmitrii Khokhriakov, Bogdan Karpiak, Anamul Md. Hoque, Bing Zhao, Subir Parui, Saroj P. Dash
These results demonstrate universal, isotropic spin transport at room temperature across large-area inhomogeneous CVD graphene, including multilayer patches, their boundaries, and folds, proving its outstanding spin-interconnect functionality and benefiting the development of scalable spintronic circuits.
Mesoscale and Nanoscale Physics · Materials Science · Applied Physics · Quantum Physics
no code implementations • 7 Sep 2015 • Katrin Kirchhoff, Bing Zhao, Wen Wang
Statistical machine translation for dialectal Arabic is characterized by a lack of data, since data acquisition involves the transcription and translation of spoken language.
no code implementations • NeurIPS 2007 • Bing Zhao, Eric P. Xing
We present a novel paradigm for statistical machine translation (SMT), based on joint modeling of word alignment and the topical aspects underlying bilingual document pairs via a hidden Markov Bilingual Topic AdMixture (HM-BiTAM).
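The generative story behind such a joint model can be sketched at a toy scale. This is a hedged illustration with made-up parameters, not the paper's HM-BiTAM estimation code: a document pair draws a topic admixture, each target word picks a topic from it, a Markov walk over source positions supplies its alignment, and the word is emitted from a topic-specific translation table.

```python
import numpy as np

rng = np.random.default_rng(0)
K, V = 2, 5                               # number of topics, target vocabulary size
alpha = np.ones(K)                        # Dirichlet prior over topic mixtures
trans = rng.dirichlet(np.ones(V), size=K) # per-topic translation distributions (toy)
jump = rng.dirichlet(np.ones(3))          # Markov jump probabilities over {-1, 0, +1}

def generate(source_len=4, target_len=6):
    """Sample (alignment position, target word) pairs from the toy generative story."""
    theta = rng.dirichlet(alpha)          # document-level topic admixture
    a = int(rng.integers(source_len))     # initial alignment position
    pairs = []
    for _ in range(target_len):
        z = rng.choice(K, p=theta)        # topic for this target word
        w = rng.choice(V, p=trans[z])     # target word from the topic's lexicon
        pairs.append((a, int(w)))
        # HMM-style alignment transition, clipped to the source sentence.
        a = int(np.clip(a + rng.choice([-1, 0, 1], p=jump), 0, source_len - 1))
    return pairs
```

The coupling is visible in the sampling loop: the topic admixture shapes which words are emitted, while the Markov walk shapes where they align, and inference in the full model would recover both jointly.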