Tanh (hyperbolic tangent) is an activation function used in neural networks:
$$f\left(x\right) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$$
Historically, tanh became preferred over the sigmoid function for multi-layer neural networks because its output is zero-centered in $(-1, 1)$, which tends to improve gradient-based training. However, tanh still saturates for large $|x|$, so it did not solve the vanishing gradient problem that sigmoids suffer from; that issue was tackled more effectively with the introduction of ReLU activations.
Image Source: Junxi Feng
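A minimal sketch of the definition above and its derivative, $f'(x) = 1 - \tanh^2(x)$, illustrating the saturation behind the vanishing-gradient issue (the function and variable names here are illustrative, not from any particular library):

```python
import math

def tanh(x):
    # Hyperbolic tangent: (e^x - e^-x) / (e^x + e^-x)
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

def tanh_grad(x):
    # Derivative of tanh: 1 - tanh(x)^2.
    # For large |x|, tanh(x) -> +/-1, so the gradient approaches 0
    # (saturation), which is the source of vanishing gradients.
    return 1.0 - tanh(x) ** 2

print(tanh(0.0))       # 0.0 (zero-centered: f(0) = 0, unlike sigmoid)
print(tanh_grad(0.0))  # 1.0 (maximum gradient at the origin)
print(tanh_grad(5.0))  # tiny value: the saturated regime
```

Note that `tanh_grad(5.0)` is already below 0.001, so layers whose pre-activations drift into the saturated regime pass almost no gradient backward.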
| Task | Papers | Share |
|---|---|---|
| Language Modelling | 21 | 3.03% |
| Sentence | 20 | 2.89% |
| Sentiment Analysis | 17 | 2.45% |
| Time Series Forecasting | 17 | 2.45% |
| Management | 15 | 2.16% |
| Classification | 15 | 2.16% |
| Decision Making | 14 | 2.02% |
| Image Generation | 14 | 2.02% |
| Image-to-Image Translation | 12 | 1.73% |