Activation Functions

Optimizer Activation Function

Introduced by Zhang et al. in 0/1 Deep Neural Networks via Block Coordinate Descent

NIPUNA is a new activation function defined as f(x) = max(g(x), x), where g(x) = x / (1 + e^(-βx)).

Source: 0/1 Deep Neural Networks via Block Coordinate Descent
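The definition above can be sketched directly in NumPy. Note that g(x) = x / (1 + e^(-βx)) is the Swish/SiLU form, so NIPUNA reduces to the identity for x ≥ 0 and to g(x) for x < 0. The function name and the default β = 1 below are illustrative choices, not from the source:

```python
import numpy as np

def nipuna(x, beta=1.0):
    """NIPUNA activation: f(x) = max(g(x), x),
    where g(x) = x / (1 + exp(-beta * x)) is the Swish/SiLU form.
    `beta` defaults to 1.0 here as an illustrative choice."""
    g = x / (1.0 + np.exp(-beta * x))
    return np.maximum(g, x)

x = np.array([-2.0, 0.0, 2.0])
print(nipuna(x))
```

For positive inputs g(x) ≤ x, so the maximum returns x unchanged; for negative inputs g(x) > x, so the smooth Swish-like branch is selected instead.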

Tasks


Task               Papers   Share
Medical Diagnosis  1        100.00%
