Activation Functions

GeGLU

Introduced by Noam Shazeer in GLU Variants Improve Transformer (2020)

GeGLU is an activation function that is a variant of the Gated Linear Unit (GLU), in which the sigmoid gate of the GLU is replaced by a GELU. It is defined as:

$$ \text{GeGLU}\left(x, W, V, b, c\right) = \text{GELU}\left(xW + b\right) \otimes \left(xV + c\right) $$

where $\otimes$ denotes element-wise multiplication.

Source: GLU Variants Improve Transformer
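
As a rough illustration, here is a minimal PyTorch sketch of the definition above; the module and parameter names are illustrative, not from the paper:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GeGLU(nn.Module):
    """GeGLU(x, W, V, b, c) = GELU(x W + b) * (x V + c)."""

    def __init__(self, d_in: int, d_out: int):
        super().__init__()
        # Two independent linear projections: W with bias b (gated
        # through GELU) and V with bias c (the linear branch).
        self.proj_w = nn.Linear(d_in, d_out)  # computes x W + b
        self.proj_v = nn.Linear(d_in, d_out)  # computes x V + c

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Element-wise product of the GELU-gated branch and the linear branch.
        return F.gelu(self.proj_w(x)) * self.proj_v(x)

# Example usage with illustrative sizes:
layer = GeGLU(d_in=512, d_out=2048)
x = torch.randn(8, 16, 512)    # (batch, sequence, features)
print(layer(x).shape)          # torch.Size([8, 16, 2048])
```

In the paper, GeGLU is used inside the Transformer feed-forward layer: it replaces the first linear projection and activation, and its gated output is then passed through a second linear projection.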
