1 code implementation • NeurIPS 2023 • Guangyan Chen, Meiling Wang, Yi Yang, Kai Yu, Li Yuan, Yufeng Yue
Large language models (LLMs) based on the generative pre-training transformer (GPT) have demonstrated remarkable effectiveness across a diverse range of downstream tasks.
Ranked #3 on 3D Point Cloud Classification on ScanObjectNN (using extra training data)
1 code implementation • ICCV 2023 • Guangyan Chen, Meiling Wang, Li Yuan, Yi Yang, Yufeng Yue
In this paper, a critical observation is made that the invisible parts of each point cloud can be directly utilized as inherent masks, and the aligned point cloud pair can be regarded as the reconstruction target.
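A minimal sketch of the masking idea described above, assuming a pair of roughly overlapping point clouds in a shared frame; the function name `invisible_mask`, the radius threshold, and the nearest-neighbor test are illustrative assumptions, not the paper's implementation.

```python
import torch

def invisible_mask(src, tgt, radius=0.05):
    """Mark points of `src` with no close neighbor in `tgt` as 'invisible'.

    src, tgt: (N, 3) and (M, 3) point clouds in a shared frame.
    Returns a boolean mask of shape (N,), True for invisible points.
    """
    dist = torch.cdist(src, tgt)                 # pairwise distances (N, M)
    return dist.min(dim=1).values > radius       # far from every tgt point

# Usage sketch: the invisible points act as the inherent mask, and the
# aligned pair serves as the reconstruction target for a masked autoencoder.
src, tgt = torch.rand(1024, 3), torch.rand(1024, 3)
mask = invisible_mask(src, tgt)                  # no random masking needed
visible_src = src[~mask]                         # tokens fed to the encoder
target = torch.cat([src, tgt], dim=0)            # reconstruction target
```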
no code implementations • 3 Dec 2022 • Tianwei Lin, Honglin Lin, Fu Li, Dongliang He, Wenhao Wu, Meiling Wang, Xin Li, Yong Liu
Then, in AdaCM, we adopt a CNN encoder to adaptively predict all parameters for the ColorMLP, conditioned on each input content and style image pair.
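The sentence describes a hypernetwork-style design: a CNN encoder emits the weights of a small per-pixel MLP that remaps the content colors. The sketch below illustrates that pattern only; the class name `AdaColorMLP`, the layer sizes, and the toy encoder are assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class AdaColorMLP(nn.Module):
    """Sketch: a CNN encoder predicts, from a content/style image pair, the
    parameters of a 3 -> hidden -> 3 per-pixel MLP applied to the content."""

    def __init__(self, hidden=16):
        super().__init__()
        self.hidden = hidden
        n_params = 3 * hidden + hidden + hidden * 3 + 3   # weights + biases
        self.encoder = nn.Sequential(                      # toy encoder over the pair
            nn.Conv2d(6, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, n_params),
        )

    def forward(self, content, style):
        # content, style: (B, 3, H, W) images.
        p = self.encoder(torch.cat([content, style], dim=1))    # (B, n_params)
        h = self.hidden
        w1, p = p[:, :3 * h].reshape(-1, h, 3), p[:, 3 * h:]
        b1, p = p[:, :h], p[:, h:]
        w2, b2 = p[:, :h * 3].reshape(-1, 3, h), p[:, h * 3:]
        # Apply the predicted MLP to every pixel of the content image.
        x = content.flatten(2).transpose(1, 2)                  # (B, HW, 3)
        x = torch.relu(torch.bmm(x, w1.transpose(1, 2)) + b1.unsqueeze(1))
        x = torch.bmm(x, w2.transpose(1, 2)) + b2.unsqueeze(1)  # (B, HW, 3)
        return x.transpose(1, 2).reshape_as(content)
```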
1 code implementation • 17 Dec 2021 • Guangyan Chen, Meiling Wang, Yufeng Yue, Qingxiang Zhang, Li Yuan
Recent Transformer-based methods have achieved strong performance in point cloud registration by exploiting the Transformer's order invariance and its ability to model dependencies when aggregating information.
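A minimal sketch of how self- and cross-attention can aggregate information between two point clouds in an order-invariant way; the module name `CrossAttnAggregator` and its dimensions are assumptions for illustration, not this paper's model.

```python
import torch
import torch.nn as nn

class CrossAttnAggregator(nn.Module):
    """Self-attention within each cloud, then cross-attention between clouds.
    Attention is permutation-equivariant over points, hence order-invariant."""

    def __init__(self, dim=256, heads=4):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, feat_src, feat_tgt):
        # feat_src: (B, N, C), feat_tgt: (B, M, C) per-point features.
        feat_src = feat_src + self.self_attn(feat_src, feat_src, feat_src)[0]
        feat_tgt = feat_tgt + self.self_attn(feat_tgt, feat_tgt, feat_tgt)[0]
        # Exchange information between the two clouds before matching.
        src_out = feat_src + self.cross_attn(feat_src, feat_tgt, feat_tgt)[0]
        tgt_out = feat_tgt + self.cross_attn(feat_tgt, feat_src, feat_src)[0]
        return src_out, tgt_out
```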
3 code implementations • ICCV 2021 • Songhua Liu, Tianwei Lin, Dongliang He, Fu Li, Meiling Wang, Xin Li, Zhengxing Sun, Qian Li, Errui Ding
Finally, the content features are normalized so that they exhibit the same local feature statistics as the computed per-point weighted style feature statistics.
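A hedged sketch of the normalization described here: attention between content and style features yields a per-point weighted style mean and standard deviation, and the content features are normalized to match those statistics. The tensor layout, the softmax attention, and the function name are assumptions, not the paper's exact formulation.

```python
import torch

def per_point_weighted_stats_norm(content, style, eps=1e-5):
    """content: (B, N, C) content features; style: (B, M, C) style features."""
    # Attention of each content point over all style points: (B, N, M).
    attn = torch.softmax(
        content @ style.transpose(1, 2) / content.shape[-1] ** 0.5, dim=-1)
    # Per-point weighted style statistics.
    mean = attn @ style                               # (B, N, C)
    var = attn @ style.pow(2) - mean.pow(2)           # E[x^2] - E[x]^2
    std = var.clamp_min(0).add(eps).sqrt()
    # Normalize the content features, then match the weighted style stats.
    c_norm = (content - content.mean(1, keepdim=True)) / (
        content.std(1, keepdim=True) + eps)
    return c_norm * std + mean
```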
no code implementations • WS 2019 • Meiling Wang, Min Xiao, Changliang Li, Yu Guo, Zhixin Zhao, Xiaonan Liu
Chinese idioms (Cheng Yu) embody five thousand years of Chinese history and culture, and they also record a large number of the scientific achievements of ancient China.