New Pruning Method Based on DenseNet Network for Image Classification

28 Aug 2021  ·  Rui-Yang Ju, Ting-Yu Lin, Jen-Shiun Chiang ·

Deep neural networks have made significant progress in the field of computer vision. Recent studies have shown that the depth, width, and shortcut connections of a neural network architecture play a crucial role in its performance. DenseNet, one of the most advanced architectures, achieves excellent convergence through dense connections, but it still has obvious shortcomings in memory usage. In this paper, we introduce a new pruning tool, the threshold, inspired by the principle of the threshold voltage in a MOSFET. This work employs the threshold to connect blocks of different depths in different ways in order to reduce memory usage; the resulting network is denoted ThresholdNet. We evaluate ThresholdNet and other networks on the CIFAR-10 dataset. Experiments show that HarDNet is twice as fast as DenseNet, and on this basis ThresholdNet is 10% faster than HarDNet with a 10% lower error rate.
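The abstract does not spell out how the threshold is applied, so the following is only a minimal sketch of the general idea of threshold-based pruning: connections whose magnitude falls below a cutoff are zeroed out, by analogy with a MOSFET that conducts only above its threshold voltage. The function name, threshold value, and weight shapes below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def threshold_prune(weights, threshold=0.05):
    """Zero out connections whose magnitude is below `threshold`.

    Analogy: like a MOSFET gate voltage, a connection only "conducts"
    (is kept) if it exceeds the threshold. Illustrative sketch only;
    the paper's exact pruning criterion is not given in the abstract.
    """
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

# Example: prune a small random weight matrix.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=(4, 4))
pruned, mask = threshold_prune(w, threshold=0.05)
sparsity = 1.0 - mask.mean()
print(f"pruned {sparsity:.0%} of connections")
```

Kept weights are unchanged and pruned ones are exactly zero, so the surviving connections can be stored sparsely, which is the kind of memory saving the paper targets.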

Results from the Paper


Ranked #10 on Image Classification on CIFAR-10 (Accuracy metric)

Task                  Dataset   Model         Metric Name     Metric Value  Global Rank
Image Classification  CIFAR-10  ThresholdNet  Accuracy        86.34         #10
Image Classification  CIFAR-10  ThresholdNet  PARAMS          15.32M        #201
Image Classification  CIFAR-10  ThresholdNet  Top-1 Accuracy  86.34         #35
Image Classification  CIFAR-10  ThresholdNet  Parameters      15.32M        #11