Activation Functions

Margin Rectified Linear Unit

Introduced by Heo et al. in A Comprehensive Overhaul of Feature Distillation

Margin Rectified Linear Unit, or Margin ReLU, is a type of activation function based on the ReLU, but negative inputs are clipped to a negative margin value m < 0 instead of to zero; positive inputs pass through unchanged, i.e. σ_m(x) = max(x, m).

Source: A Comprehensive Overhaul of Feature Distillation
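
The operation is simple to implement. Below is a minimal PyTorch sketch, assuming a fixed per-channel margin tensor is supplied up front (the paper estimates the margin channel-wise from the teacher network's statistics); the MarginReLU class name and the numbers in the usage example are illustrative, not the authors' code.

```python
import torch
import torch.nn as nn


class MarginReLU(nn.Module):
    """Margin ReLU: sigma_m(x) = max(x, m) with a non-positive margin m.

    Positive inputs pass through unchanged; negative inputs are clipped to
    the margin m instead of to zero as in a standard ReLU.
    """

    def __init__(self, margin):
        super().__init__()
        # `margin` is assumed to be a per-channel tensor of shape (C,) with
        # non-positive entries; the paper derives it from the teacher's
        # channel-wise statistics, here it is simply passed in.
        self.register_buffer("margin", margin.clamp(max=0.0))

    def forward(self, x):
        # Broadcast the per-channel margin over (N, C, H, W) feature maps.
        m = self.margin.view(1, -1, 1, 1)
        return torch.max(x, m)


# Illustrative usage with a hypothetical constant margin of -0.5 per channel.
if __name__ == "__main__":
    act = MarginReLU(margin=-0.5 * torch.ones(64))
    feat = torch.randn(2, 64, 8, 8)
    out = act(feat)
    print(out.min().item() >= -0.5)  # negative values are clipped at the margin
```

Clamping the margin to non-positive values ensures the function reduces to the standard ReLU when m = 0.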


Tasks


Task Papers Share
General Classification 1 25.00%
Image Classification 1 25.00%
Object Detection 1 25.00%
Semantic Segmentation 1 25.00%

Components


No components found.

Categories

Activation Functions