Activation Functions

ReGLU

Introduced by Shazeer in GLU Variants Improve Transformer

ReGLU is an activation function that is a variant of the Gated Linear Unit (GLU): it replaces GLU's sigmoid gate with a ReLU. The definition is as follows:

$$ \text{ReGLU}\left(x, W, V, b, c\right) = \max\left(0, xW + b\right) \otimes \left(xV + c\right) $$

Source: GLU Variants Improve Transformer
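The formula above can be sketched directly in NumPy. This is a minimal illustration, not a reference implementation; the shapes chosen for `x`, `W`, `V`, `b`, and `c` are assumptions for the example only.

```python
import numpy as np

def reglu(x, W, V, b, c):
    """ReGLU(x, W, V, b, c) = max(0, xW + b) * (xV + c).

    Assumed shapes (illustrative only):
      x: (batch, d_in), W and V: (d_in, d_out), b and c: (d_out,).
    """
    # ReLU-gated branch, multiplied elementwise with the linear branch
    return np.maximum(0.0, x @ W + b) * (x @ V + c)

# Tiny usage example with hypothetical dimensions
rng = np.random.default_rng(0)
x = rng.normal(size=(2, 4))
W = rng.normal(size=(4, 3))
V = rng.normal(size=(4, 3))
b = np.zeros(3)
c = np.zeros(3)
out = reglu(x, W, V, b, c)
print(out.shape)  # (2, 3)
```

Note that wherever the gate `xW + b` is non-positive, the ReLU zeroes that output entry regardless of the value of the linear branch `xV + c`.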
