Shifted Rectified Linear Unit

Introduced by Pishchik in Trainable Activations for Image Classification

The Shifted Rectified Linear Unit, or ShiLU, is a modification of the ReLU activation function with two trainable parameters: a scale $\alpha$ and a shift $\beta$.

$$\text{ShiLU}(x) = \alpha\,\text{ReLU}(x) + \beta$$

Source: Trainable Activations for Image Classification
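
As a minimal sketch (not the authors' reference code), ShiLU can be written as a small PyTorch module with scalar trainable parameters. Initializing $\alpha = 1$ and $\beta = 0$, so that the activation starts out identical to plain ReLU, is an assumption made here for illustration; the paper's initialization scheme is not given in this entry.

```python
import torch
import torch.nn as nn


class ShiLU(nn.Module):
    """ShiLU(x) = alpha * ReLU(x) + beta, with trainable alpha and beta."""

    def __init__(self):
        super().__init__()
        # Assumption: scalar parameters initialized so ShiLU starts as plain ReLU.
        self.alpha = nn.Parameter(torch.tensor(1.0))
        self.beta = nn.Parameter(torch.tensor(0.0))

    def forward(self, x):
        # Scale the ReLU output and shift it by the learned offset.
        return self.alpha * torch.relu(x) + self.beta


# Usage: drop-in replacement for nn.ReLU inside a model.
act = ShiLU()
print(act(torch.tensor([-2.0, -0.5, 0.0, 1.5])))  # negative inputs map to beta (0.0 here)
```

Because $\alpha$ and $\beta$ are `nn.Parameter`s, they are updated by the optimizer along with the rest of the network's weights.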

Tasks

Task | Papers | Share
Image Classification | 1 | 100.00%

Components

Component | Type
ReLU | Activation Functions

Categories

Activation Functions