1 code implementation • 1 Apr 2024 • Yunsong Wang, Hanlin Chen, Gim Hee Lee
Recent advancements in vision-language foundation models have significantly enhanced open-vocabulary 3D scene understanding.
Tasks: Open Vocabulary Semantic Segmentation, Scene Understanding (+1)
no code implementations • 1 Dec 2023 • Hanlin Chen, Chen Li, Gim Hee Lee
In this work, we propose a neural implicit surface reconstruction pipeline with guidance from 3D Gaussian Splatting to recover highly detailed surfaces.
no code implementations • 22 Jan 2023 • Hanlin Chen, Renyuan Luo, Yiheng Feng
Navigating connected and automated vehicles (CAVs) in such areas relies heavily on how the vehicle defines drivable areas from perception information.
no code implementations • CVPR 2023 • Simin Chen, Hanlin Chen, Mirazul Haque, Cong Liu, Wei Yang
Recent advancements in deploying deep neural networks (DNNs) on resource-constrained devices have generated interest in input-adaptive dynamic neural networks (DyNNs).
no code implementations • 21 Dec 2022 • Mengqi Guo, Chen Li, Hanlin Chen, Gim Hee Lee
In view of this, we explore the task of incremental learning for NIRs in this work.
no code implementations • 29 Sep 2021 • Hanlin Chen, Ming Lin, Xiuyu Sun, Hao Li
Based on these new discoveries, we propose i) a novel hybrid zero-shot proxy which outperforms existing ones by a large margin and is transferable among popular search spaces; ii) a new index for better measuring the true performance of ZS-NAS proxies in constrained NAS.
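The hybrid-proxy idea can be illustrated with a minimal, hypothetical sketch: score candidate architectures at initialization by combining several cheap training-free signals into one ranking. The two component proxies and the weighting `alpha` below are illustrative stand-ins, not the paper's actual formulation.

```python
import math
import random

def param_count_proxy(widths):
    """Toy proxy: log of parameter count for a plain MLP whose
    layer widths are given (a hypothetical stand-in for a
    capacity-based zero-shot signal)."""
    params = sum(a * b for a, b in zip(widths, widths[1:]))
    return math.log(params)

def depth_width_proxy(widths):
    """Toy proxy rewarding depth and balanced widths (a
    hypothetical stand-in for an expressivity signal)."""
    return len(widths) * min(widths) / max(widths)

def hybrid_proxy(widths, alpha=0.5):
    """Combine the two signals; the linear mix and alpha are
    assumptions for illustration only."""
    return alpha * param_count_proxy(widths) + (1 - alpha) * depth_width_proxy(widths)

# Rank a few random candidate architectures without any training.
random.seed(0)
candidates = [[random.choice([16, 32, 64]) for _ in range(random.randint(3, 6))]
              for _ in range(5)]
best = max(candidates, key=hybrid_proxy)
print(best)
```

The point of the sketch is the workflow, not the specific signals: every candidate is scored in microseconds at initialization, so the proxy can screen thousands of architectures before any gradient step is taken.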
no code implementations • 8 Sep 2020 • Hanlin Chen, Li'an Zhuo, Baochang Zhang, Xiawu Zheng, Jianzhuang Liu, Rongrong Ji, David Doermann, Guodong Guo
In this paper, binarized neural architecture search (BNAS), with a search space of binarized convolutions, is introduced to produce extremely compressed models that reduce the heavy computational cost of running DNNs on embedded devices for edge computing.
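To make the "binarized convolutions" in the search space concrete, here is a minimal sketch of XNOR-Net-style weight binarization applied to a 1-D convolution: weights are reduced to {-1, +1} with a single per-filter scaling factor. This is a simplified illustration of the general technique, not the paper's exact binarization scheme.

```python
def binarize(weights):
    """Binarize real-valued weights to {-1, +1} with one scaling
    factor (the mean absolute value), XNOR-Net style."""
    alpha = sum(abs(w) for w in weights) / len(weights)
    return alpha, [1.0 if w >= 0 else -1.0 for w in weights]

def conv1d(signal, kernel):
    """Plain valid-mode 1-D convolution (correlation) for the demo."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def binary_conv1d(signal, weights):
    """Convolve with binarized weights; the scaling factor is applied
    once to the output, so the inner loop needs only additions and
    sign flips instead of full-precision multiplies."""
    alpha, b = binarize(weights)
    return [alpha * y for y in conv1d(signal, b)]

signal = [0.5, -1.0, 2.0, 0.25, -0.75]
weights = [0.3, -0.6, 0.9]
print(binary_conv1d(signal, weights))
```

Because each weight carries only a sign, a binarized layer stores roughly 1 bit per weight instead of 32, which is the source of the extreme compression the search space targets.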
no code implementations • ECCV 2020 • Hanlin Chen, Baochang Zhang, Song Xue, Xuan Gong, Hong Liu, Rongrong Ji, David Doermann
Deep convolutional neural networks (DCNNs) dominate as the best performers in machine learning, yet they remain vulnerable to adversarial attacks.
no code implementations • CVPR 2020 • Li'an Zhuo, Baochang Zhang, Linlin Yang, Hanlin Chen, Qixiang Ye, David Doermann, Guodong Guo, Rongrong Ji
Conventional learning methods simplify the bilinear model by treating its two intrinsically coupled factors as independent, which degrades the optimization procedure.
no code implementations • 30 Apr 2020 • Li'an Zhuo, Baochang Zhang, Hanlin Chen, Linlin Yang, Chen Chen, Yanjun Zhu, David Doermann
To this end, a Child-Parent (CP) model is introduced to a differentiable NAS to search the binarized architecture (Child) under the supervision of a full-precision model (Parent).
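The Child-Parent supervision can be sketched as a distillation-style objective: the binarized Child is trained against both the task target and the output of the full-precision Parent. The quadratic losses and the weighting `lam` below are illustrative assumptions, not the paper's exact objective.

```python
def sign(x):
    """Binarize a single weight to +1 or -1."""
    return 1.0 if x >= 0 else -1.0

def forward(weights, x):
    """Dot-product 'network' used for both Parent and Child."""
    return sum(w * xi for w, xi in zip(weights, x))

def child_parent_loss(child_w, parent_w, x, target, lam=0.5):
    """The Child runs with binarized weights; its loss mixes the
    task error with a term pulling the Child's output toward the
    full-precision Parent's output (the supervision signal)."""
    child_out = forward([sign(w) for w in child_w], x)
    parent_out = forward(parent_w, x)
    task = (child_out - target) ** 2
    distill = (child_out - parent_out) ** 2
    return task + lam * distill

x = [0.2, -0.4, 1.0]
parent_w = [0.8, -0.5, 0.3]
child_w = [0.6, -0.7, 0.2]
print(child_parent_loss(child_w, parent_w, x, target=1.0))
```

Setting `lam=0` recovers plain task training of the binarized Child; increasing it strengthens the Parent's guidance, which is the mechanism the Child-Parent model exploits during the architecture search.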
no code implementations • 25 Nov 2019 • Hanlin Chen, Li'an Zhuo, Baochang Zhang, Xiawu Zheng, Jianzhuang Liu, David Doermann, Rongrong Ji
A variant, binarized neural architecture search (BNAS), with a search space of binarized convolutions, can produce extremely compressed models.