Rectified Linear Unit N

Introduced by Pishchik in Trainable Activations for Image Classification

The Rectified Linear Unit N, or ReLUN, is a modification of the ReLU6 activation function in which the fixed upper bound of 6 is replaced by a trainable parameter $n$.

$$\text{ReLUN}(x) = \min(\max(0, x), n)$$

Source: Trainable Activations for Image Classification
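To make the definition concrete, below is a minimal PyTorch sketch of ReLUN as a module with a learnable cap. The module structure and the initialization of $n$ at 6 (so the module starts out identical to ReLU6) are assumptions for illustration, not the paper's reference implementation.

```python
import torch
import torch.nn as nn

class ReLUN(nn.Module):
    """min(max(0, x), n) with a trainable upper bound n."""

    def __init__(self, n: float = 6.0):
        super().__init__()
        # Assumption: initialize n at 6 so the activation matches
        # ReLU6 at the start; training then adjusts the cap.
        self.n = nn.Parameter(torch.tensor(float(n)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Elementwise min against the scalar parameter broadcasts,
        # and gradients flow to n wherever x exceeds the cap.
        return torch.minimum(torch.relu(x), self.n)

act = ReLUN()
print(act(torch.tensor([-1.0, 3.0, 8.0])))  # -> [0., 3., 6.] at initialization
```

Because the gradient with respect to $n$ is nonzero exactly where the input exceeds the cap, the bound can shrink or grow during training rather than staying fixed at 6.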

Tasks


Task                  Papers  Share
Image Classification  1       100.00%

Categories

Activation Functions