no code implementations • 1 Jan 2021 • Marina Adriana Mercioni, Stefan Holban
The activation function is an important factor in the performance of a deep neural network and warrants continued research; for that reason, we have extended our work in this direction.
In this paper we propose two novel activation functions, which we call TeLU and learnable TeLU.
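The abstract does not give the functions' definitions. As an illustrative sketch only, assuming the commonly cited form TeLU(x) = x · tanh(eˣ), with a hypothetical trainable scalar for the learnable variant (both assumptions, not taken from this abstract), a minimal implementation might look like:

```python
import numpy as np

def telu(x):
    # Assumed form: TeLU(x) = x * tanh(exp(x)).
    # See the paper for the authors' exact definition.
    return x * np.tanh(np.exp(x))

def telu_learnable(x, alpha=1.0):
    # Hypothetical learnable variant: alpha stands in for a
    # trainable parameter (an assumption, not from the abstract).
    return x * np.tanh(alpha * np.exp(x))

x = np.array([-2.0, 0.0, 2.0])
print(telu(x))  # smooth, non-monotonic near zero, ~identity for large x
```

For large positive inputs tanh(eˣ) saturates at 1, so the function behaves like the identity, while negative inputs are smoothly damped.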