Margin Rectified Linear Unit, or Margin ReLU, is an activation function based on the ReLU, but with a negative threshold in place of ReLU's zero threshold: inputs are clamped from below at a negative margin value m rather than at zero, i.e. the output is max(x, m) with m < 0.
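A minimal sketch of the idea, assuming PyTorch; the function name `margin_relu` and the fixed scalar margin of -1.0 are illustrative (in the feature-distillation setting of the source paper the margin is a negative value estimated from the teacher's responses, not a hand-picked constant):

```python
import torch

def margin_relu(x: torch.Tensor, margin: float = -1.0) -> torch.Tensor:
    """Margin ReLU: max(x, m) with a negative margin m instead of ReLU's 0."""
    # torch.clamp with only `min` set clips values from below, leaving
    # everything above the margin (including all positives) unchanged.
    return torch.clamp(x, min=margin)

# Usage: values below the margin are clipped to it; positives pass through.
x = torch.tensor([-3.0, -0.5, 0.0, 2.0])
print(margin_relu(x, margin=-1.0))  # tensor([-1.0000, -0.5000, 0.0000, 2.0000])
```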
Source: A Comprehensive Overhaul of Feature Distillation
| Task | Papers | Share |
|---|---|---|
| General Classification | 1 | 25.00% |
| Image Classification | 1 | 25.00% |
| Object Detection | 1 | 25.00% |
| Semantic Segmentation | 1 | 25.00% |