no code implementations • 24 Mar 2024 • Shaojie Li, Haichen Qu, Xinqi Dong, Bo Dang, Hengyi Zang, Yulu Gong
In exploring the application of deep learning technologies to medical diagnostics, Magnetic Resonance Imaging (MRI) provides a unique perspective for observing and diagnosing complex neurodegenerative diseases such as Alzheimer's Disease (AD).
no code implementations • 21 Mar 2024 • Shaojie Li, Xinqi Dong, Danqing Ma, Bo Dang, Hengyi Zang, Yulu Gong
First, key features are extracted from the massive user-evaluation data provided by operators via data preprocessing and feature engineering, and a multi-dimensional feature set with statistical significance is constructed. Then, multiple base models are built with linear regression, decision tree, LightGBM, and other machine learning algorithms to identify the best base model. Finally, ensemble algorithms such as Averaging, Voting, Blending, and Stacking are integrated to refine multiple fusion models and establish the fusion model best suited to operator user evaluation.
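The fusion step can be illustrated with a minimal sketch. The scores, labels, and the least-squares meta-learner below are illustrative stand-ins (not the paper's actual models or data): Averaging takes the mean of base-model scores, Voting takes a majority over thresholded predictions, and Stacking/Blending fits a meta-model on held-out base-model outputs.

```python
import numpy as np

# Hypothetical held-out scores from three base models for five users.
base_scores = np.array([
    [0.2, 0.3, 0.1],
    [0.8, 0.7, 0.9],
    [0.6, 0.4, 0.7],
    [0.1, 0.2, 0.2],
    [0.9, 0.8, 0.8],
])
y = np.array([0, 1, 1, 0, 1])  # ground-truth user labels

# Averaging fusion: mean of base-model scores.
avg_pred = base_scores.mean(axis=1)

# Voting fusion: majority vote over thresholded base predictions.
votes = (base_scores > 0.5).astype(int)
vote_pred = (votes.sum(axis=1) >= 2).astype(int)

# Stacking/Blending sketch: a least-squares meta-learner fitted on the
# base-model outputs (a stand-in for a trained meta-model).
w, *_ = np.linalg.lstsq(base_scores, y.astype(float), rcond=None)
stack_pred = base_scores @ w
```

By construction, the least-squares stacker's squared error can be no worse than plain averaging on the data it was fitted on, since averaging is itself one linear combination of the base scores.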
no code implementations • 20 Mar 2024 • Danqing Ma, Shaojie Li, Bo Dang, Hengyi Zang, Xinqi Dong
In addition, a FasterNet module is introduced to replace the C3 module in the YOLOv5 Backbone.
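The key idea behind FasterNet is the partial convolution: a regular convolution is applied to only a fraction of the channels while the rest pass through untouched, cutting FLOPs and memory access. The `pconv` helper below is a hypothetical NumPy sketch of that idea, not the actual FasterNet or YOLOv5 code:

```python
import numpy as np

def pconv(x, weight, n_div=4):
    """FasterNet-style partial convolution (illustrative sketch):
    apply a 3x3 conv to the first C/n_div channels of x (C, H, W)
    and pass the remaining channels through unchanged."""
    c, h, w = x.shape
    cp = c // n_div                                  # channels convolved
    out = x.copy()                                   # untouched channels kept
    padded = np.pad(x[:cp], ((0, 0), (1, 1), (1, 1)))
    for i in range(h):
        for j in range(w):
            patch = padded[:, i:i + 3, j:j + 3]      # (cp, 3, 3)
            # weight has shape (cp, cp, 3, 3); contract the last 3 axes.
            out[:cp, i, j] = np.tensordot(weight, patch, axes=3)
    return out
```

With identity kernels (a 1 at the kernel center of each channel's own filter), the convolved slice reproduces its input, which makes the channel-split behavior easy to check.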
no code implementations • 15 Sep 2023 • Yuanfeng Wu, Shaojie Li, Zhiqiang Du, Wentao Zhu
Hence, we propose BROW, a foundation model that extracts better feature representations for whole slide images (WSIs) and can be conveniently adapted to downstream tasks with little or no fine-tuning.
no code implementations • 25 Jul 2023 • Shaojie Li, Yong Liu
Gradient clipping is a commonly used technique to stabilize the training process of neural networks.
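The standard clipping rule the snippet refers to rescales all gradient tensors whenever their global L2 norm exceeds a threshold. A minimal NumPy sketch (the function name is illustrative):

```python
import numpy as np

def clip_by_global_norm(grads, max_norm):
    """Gradient clipping by global norm: if the L2 norm of all gradient
    tensors taken together exceeds max_norm, rescale every tensor by
    max_norm / norm; otherwise leave the gradients unchanged."""
    norm = np.sqrt(sum(np.sum(g * g) for g in grads))
    scale = min(1.0, max_norm / (norm + 1e-12))
    return [g * scale for g in grads], norm
```

Because the same scale is applied to every tensor, clipping preserves the gradient's direction and only shrinks its magnitude, which is what stabilizes training.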
1 code implementation • 12 Apr 2023 • Yifeng Shi, Feng Lv, Xinliang Wang, Chunlong Xia, Shaojie Li, Shujie Yang, Teng Xi, Gang Zhang
To address these, we designed the 1st Foundation Model Challenge, with the goal of increasing the popularity of foundation model technology in traffic scenarios and promoting the rapid development of the intelligent transportation industry.
Ranked #1 on 2D Object Detection on CeyMo
no code implementations • 30 Apr 2022 • Shaojie Li, Sheng Ouyang, Yong Liu
The theoretical analysis of spectral clustering mainly focuses on consistency, while there is relatively little research on its generalization performance.
no code implementations • NeurIPS 2021 • Shaojie Li, Yong Liu
In the smoothness scenario, we provide generalization bounds that not only have a logarithmic dependency on the label-set cardinality but also attain a faster convergence rate of order $\mathcal{O}(\frac{1}{n})$ on the sample size $n$.
no code implementations • 9 Nov 2021 • Shaojie Li, Yong Liu
We first successfully establish learning rates for these algorithms in a general nonconvex setting, where the analysis sheds light on the trade-off between optimization and generalization and on the role of early stopping.
1 code implementation • NeurIPS 2021 • Shaojie Li, Jie Wu, Xuefeng Xiao, Fei Chao, Xudong Mao, Rongrong Ji
In this work, we revisit the role of the discriminator in GAN compression and design a novel generator-discriminator cooperative compression scheme, termed GCC.
no code implementations • ICLR 2022 • Shaojie Li, Yong Liu
In this paper, we provide improved generalization analyses for almost all existing generalization measures of minimax problems, enabling minimax problems to achieve significantly sharper bounds of order $\mathcal{O}\left( 1/n \right)$ with high probability.
no code implementations • 19 Jul 2021 • Shaojie Li, Yong Liu
the sample size $n$ for ERM and SGD with milder assumptions in convex learning and similar high probability rates of order $\mathcal{O} (1/n)$ in nonconvex learning, rather than in expectation.
1 code implementation • 26 Mar 2021 • Shaojie Li, Mingbao Lin, Yan Wang, Yongjian Wu, Yonghong Tian, Ling Shao, Rongrong Ji
Besides, a self-distillation module is adopted to convert the feature map of deeper layers into a shallower one.
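The deeper-to-shallower conversion can be sketched as a channel projection followed by an MSE alignment. The function and the `proj` matrix below are illustrative stand-ins (not the paper's module), a plain channel-mixing matrix plays the role of a 1x1 conv, and spatial resizing is omitted by assuming equal spatial sizes:

```python
import numpy as np

def self_distill_loss(deep_feat, shallow_feat, proj):
    """Illustrative self-distillation loss: map a deeper feature map
    (C2, H, W) to the shallower layer's channel count C1 via a 1x1-conv-like
    projection proj of shape (C1, C2), then align with an MSE loss."""
    mapped = np.tensordot(proj, deep_feat, axes=([1], [0]))  # (C1, H, W)
    return np.mean((mapped - shallow_feat) ** 2)
```

With an identity projection and identical feature maps the loss is zero, which is a quick sanity check that the projection and reduction axes line up.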
1 code implementation • 20 Jan 2021 • Mingbao Lin, Rongrong Ji, Shaojie Li, Yan Wang, Yongjian Wu, Feiyue Huang, Qixiang Ye
Inspired by the face recognition community, we apply the message-passing algorithm Affinity Propagation to the weight matrices to obtain an adaptive number of exemplars, which then act as the preserved filters.
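The exemplar-selection step can be sketched with scikit-learn's `AffinityPropagation`; unlike k-means, it does not require fixing the number of clusters in advance, which is what makes the number of preserved filters adaptive. The random filter data below is purely illustrative, not from the paper:

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)
# Hypothetical 3x3 filters flattened to rows: two groups of similar filters.
filters = np.vstack([
    rng.normal(0.0, 0.05, size=(6, 9)),
    rng.normal(1.0, 0.05, size=(6, 9)),
])

# Message passing chooses exemplar rows; their indices identify the
# filters to preserve, and labels assign every filter to an exemplar.
ap = AffinityPropagation(random_state=0).fit(filters)
exemplar_idx = ap.cluster_centers_indices_
labels = ap.labels_
```

Each non-exemplar filter is then represented by the exemplar of its group, so the pruned layer keeps only the rows indexed by `exemplar_idx`.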
1 code implementation • 17 Nov 2020 • Shaojie Li, Mingbao Lin, Yan Wang, Fei Chao, Ling Shao, Rongrong Ji
The latter simultaneously distills informative attention maps from both the generator and the discriminator of a pre-trained model into the searched generator, effectively stabilizing the adversarial training of our lightweight model.
1 code implementation • 23 Jan 2020 • Mingbao Lin, Liujuan Cao, Shaojie Li, Qixiang Ye, Yonghong Tian, Jianzhuang Liu, Qi Tian, Rongrong Ji
Our approach, referred to as FilterSketch, encodes the second-order information of pre-trained weights, which enables the representation capacity of pruned networks to be recovered with a simple fine-tuning procedure.