Rethinking ResNets: Improved Stacking Strategies With High Order Schemes

28 Mar 2021 · Zhengbo Luo, Zitang Sun, Weilian Zhou, Zizhang Wu, Sei-ichiro Kamata

Various deep neural network (DNN) architectures hold major performance records in computer vision, yet while they draw attention worldwide, the design of the overall structure still lacks general guidance. Based on the relationship between DNN design and numerical differential equations, we perform a fair comparison of residual designs from a higher-order perspective. We show that the widely used DNN design strategy of constantly stacking a small design (usually 2-3 layers) can be easily improved, supported by solid theoretical grounding and requiring no extra parameters. We reorganise the residual design in higher-order ways, inspired by the observation that many effective networks can be interpreted as different numerical discretisations of differential equations. The design of ResNet follows a relatively simple scheme, Euler forward; however, the situation becomes complicated rapidly as blocks are stacked. We hypothesise that a stacked ResNet is loosely equivalent to a higher-order scheme, in which case the current forward propagation may be relatively weak compared with a typical high-order method such as Runge-Kutta. We propose HO-ResNet to verify this hypothesis on widely used CV benchmarks with extensive experiments. Stable and noticeable gains in performance are observed, and convergence and robustness also improve. Our stacking strategy improves ResNet-30 by 2.15 per cent and ResNet-58 by 2.35 per cent on CIFAR-10, with the same settings and parameters. The proposed strategy is fundamental and theoretically grounded, and can therefore be applied to any network as a general guideline.
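For intuition, here is a minimal sketch (not the authors' code) of the core idea: a plain residual block computes an Euler-forward step, x_{n+1} = x_n + f(x_n), while a Runge-Kutta-style block reuses the residual function f several times per update. Class names such as `ResidualFunction`, `EulerBlock`, and `RK4Block` are illustrative assumptions, not identifiers from the paper; whether f is shared across the four RK stages or each stage gets its own sub-block is a design choice, and the paper's exact configuration may differ.

```python
import torch
import torch.nn as nn


class ResidualFunction(nn.Module):
    """The learned vector field f(x): a small conv stack, as in a basic ResNet block."""

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.body(x)


class EulerBlock(nn.Module):
    """Standard residual block: x_{n+1} = x_n + f(x_n), i.e. one Euler-forward step."""

    def __init__(self, channels: int):
        super().__init__()
        self.f = ResidualFunction(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.f(x)


class RK4Block(nn.Module):
    """Classical 4th-order Runge-Kutta step built from one residual function:
    x_{n+1} = x_n + (k1 + 2*k2 + 2*k3 + k4) / 6.

    Sharing f across the four stages keeps the parameter count equal to a
    single Euler block (an assumption made here for simplicity).
    """

    def __init__(self, channels: int):
        super().__init__()
        self.f = ResidualFunction(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        k1 = self.f(x)
        k2 = self.f(x + 0.5 * k1)
        k3 = self.f(x + 0.5 * k2)
        k4 = self.f(x + k3)
        return x + (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
```

Under this reading, replacing a run of Euler blocks with RK4-style blocks changes only how the same residual functions are composed, which matches the abstract's claim that the improvement needs no extra parameters; the trade-off is extra forward evaluations of f per step.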
