Batch Normalization

Batch Normalization aims to reduce internal covariate shift and thereby accelerate the training of deep neural networks. It accomplishes this via a normalization step that fixes the means and variances of layer inputs. Batch Normalization also has a beneficial effect on gradient flow through the network, by reducing the dependence of gradients on the scale of the parameters or of their initial values. This allows the use of much higher learning rates without the risk of divergence. Furthermore, Batch Normalization regularizes the model and reduces the need for Dropout.

We apply a batch normalization layer as follows for a minibatch $\mathcal{B} = \{x_{1}, \dots, x_{m}\}$ of size $m$:

$$ \mu_{\mathcal{B}} = \frac{1}{m}\sum^{m}_{i=1}x_{i} $$

$$ \sigma^{2}_{\mathcal{B}} = \frac{1}{m}\sum^{m}_{i=1}\left(x_{i}-\mu_{\mathcal{B}}\right)^{2} $$

$$ \hat{x}_{i} = \frac{x_{i} - \mu_{\mathcal{B}}}{\sqrt{\sigma^{2}_{\mathcal{B}}+\epsilon}} $$

$$ y_{i} = \gamma\hat{x}_{i} + \beta = \text{BN}_{\gamma, \beta}\left(x_{i}\right) $$

where $\gamma$ and $\beta$ are learnable scale and shift parameters and $\epsilon$ is a small constant added for numerical stability.

Source: Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
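As a concrete illustration, below is a minimal NumPy sketch of the training-time forward pass corresponding to the equations above. The function name `batch_norm` and the shapes are illustrative assumptions, not part of the original method page, and the inference-time behavior (which uses running population statistics instead of minibatch statistics) is omitted.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Training-time batch normalization for a minibatch x of shape (m, d).

    gamma, beta: learnable scale and shift parameters, shape (d,).
    eps: small constant for numerical stability.
    """
    mu = x.mean(axis=0)                     # per-feature minibatch mean mu_B
    var = x.var(axis=0)                     # per-feature minibatch variance sigma^2_B
    x_hat = (x - mu) / np.sqrt(var + eps)   # normalize to zero mean, unit variance
    return gamma * x_hat + beta             # scale and shift: y = gamma * x_hat + beta

# Example usage (hypothetical data): a minibatch of 4 samples with 3 features
x = np.random.randn(4, 3) * 5.0 + 2.0
gamma, beta = np.ones(3), np.zeros(3)
y = batch_norm(x, gamma, beta)
print(y.mean(axis=0), y.var(axis=0))  # approximately 0 and 1 for this choice of gamma, beta
```

With $\gamma = 1$ and $\beta = 0$ the output simply has (approximately) zero mean and unit variance per feature; in general the learned $\gamma$ and $\beta$ let the network recover whatever scale and shift is useful for the following layer.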

Tasks


| Task | Papers | Share |
|------|--------|-------|
| Object Detection | 42 | 6.33% |
| Semantic Segmentation | 34 | 5.12% |
| Image Classification | 30 | 4.52% |
| Classification | 24 | 3.61% |
| Image Generation | 14 | 2.11% |
| Image Segmentation | 13 | 1.96% |
| Image-to-Image Translation | 13 | 1.96% |
| Quantization | 13 | 1.96% |
| Test-time Adaptation | 12 | 1.81% |

