Activation Functions

DELU

Introduced by Pishchik in Trainable Activations for Image Classification

DELU is an activation function with a trainable parameter $n$. For positive inputs it combines a linear term with an exponential term, and for non-positive inputs it reduces to SiLU. Both branches evaluate to zero at $x = 0$, so the function is continuous there.

$$\text{DELU}(x) = \begin{cases} \text{SiLU}(x), & x \leqslant 0 \\ (n + 0.5)x + \left|e^{-x} - 1\right|, & x > 0 \end{cases}$$

Source: Trainable Activations for Image Classification
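The piecewise definition above can be sketched in plain Python as follows. In practice $n$ is a trainable parameter learned alongside the network weights; here it is a plain float argument for illustration, and the default value of 1.0 is an assumption, not necessarily the initialization used in the paper.

```python
import math

def silu(x):
    # SiLU / Swish: x * sigmoid(x)
    return x / (1.0 + math.exp(-x))

def delu(x, n=1.0):
    # DELU: SiLU for x <= 0, linear-plus-exponential term for x > 0.
    # n is the trainable parameter; its initial value here (1.0) is illustrative.
    if x <= 0:
        return silu(x)
    return (n + 0.5) * x + abs(math.exp(-x) - 1.0)
```

Note that both branches give 0 at the origin: `silu(0) == 0` and `(n + 0.5) * 0 + |e^0 - 1| == 0`, so DELU is continuous at zero regardless of $n$.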


Tasks


| Task | Papers | Share |
|------|--------|-------|
| Image Classification | 1 | 100.00% |

Components


| Component | Type |
|-----------|------|
| SiLU | Activation Functions |
