TeLU: A New Activation Function for Deep Learning

1 Jan 2021 · Marina Adriana Mercioni, Stefan Holban

In this paper we propose two novel activation functions, which we call TeLU and TeLU learnable. They combine ReLU (Rectified Linear Unit), the hyperbolic tangent (tanh), and ELU (Exponential Linear Unit), without and with a learnable parameter, respectively. We show that TeLU and TeLU learnable give better results than other popular activation functions, including ReLU, Mish, and TanhExp, on current architectures evaluated on computer vision datasets.
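The abstract does not give the closed form of the proposed functions, so the following is only a minimal PyTorch sketch. It assumes TeLU(x) = x · tanh(ELU(x)), i.e. a tanh gate applied to ELU in the spirit of the Mish and TanhExp functions the paper compares against, and it assumes the learnable variant exposes the ELU alpha as a trainable parameter. The class names TeLU and TeLULearnable and the exact formulas are illustrative assumptions, not the authors' stated definitions.

```python
# Hypothetical sketch of TeLU and a learnable variant.
# Assumption: f(x) = x * tanh(ELU(x)); the learnable variant trains ELU's alpha.
# This is inferred from the abstract's description, not the paper's exact formula.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TeLU(nn.Module):
    """Assumed form: f(x) = x * tanh(ELU(x))."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * torch.tanh(F.elu(x))


class TeLULearnable(nn.Module):
    """Assumed learnable variant: f(x) = x * tanh(ELU_alpha(x)) with trainable alpha."""

    def __init__(self, alpha: float = 1.0):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(alpha))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # ELU with a learnable alpha: x for x > 0, alpha * (exp(x) - 1) otherwise.
        elu = torch.where(x > 0, x, self.alpha * (torch.exp(x) - 1.0))
        return x * torch.tanh(elu)


if __name__ == "__main__":
    # Quick check: both variants act as drop-in replacements for nn.ReLU.
    x = torch.linspace(-3.0, 3.0, 7)
    print(TeLU()(x))
    print(TeLULearnable()(x))
```

Under this assumption, either module can replace nn.ReLU in standard vision architectures, with the learnable variant adding one extra trainable scalar per activation layer.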

