Low-Rank Factorization-based Multi-head Attention Mechanism, or LAMA, is an attention module that uses low-rank factorization to reduce computational complexity and parameter count. It applies low-rank bilinear pooling to construct a structured sentence representation that attends to multiple aspects of the sentence, with one attention distribution per head.
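The paper's exact parameterization is not reproduced here; the following is a minimal NumPy sketch of the general idea, in which a full d×d bilinear attention map is factored into two thin matrices `U` and `V` of rank k ≪ d, and `P` holds one pooling vector per head. All identifiers (`lama_attention`, `H`, `U`, `V`, `P`) are illustrative assumptions, not names from the paper or its code.

```python
import numpy as np

def lama_attention(H, U, V, P):
    """Low-rank bilinear multi-head attention over a sentence (sketch).

    H : (n, d)  token hidden states
    U : (d, k)  low-rank factor (k << d)
    V : (d, k)  low-rank factor
    P : (k, h)  per-head pooling vectors (h heads)

    Returns (h, d): one attended summary vector per head.
    """
    # Low-rank bilinear interaction: project the states through two thin
    # factors and combine elementwise, instead of a full d x d bilinear map.
    Z = np.tanh(H @ U) * np.tanh(H @ V)   # (n, k)
    scores = Z @ P                        # (n, h) unnormalized logits
    # Softmax over tokens, independently per head (numerically stabilized).
    scores -= scores.max(axis=0, keepdims=True)
    A = np.exp(scores)
    A /= A.sum(axis=0, keepdims=True)     # (n, h) attention weights
    # Each head attends to a different aspect of the sentence.
    return A.T @ H                        # (h, d) structured representation

# Tiny usage example with random data.
rng = np.random.default_rng(0)
n, d, k, h = 6, 16, 4, 3
H = rng.standard_normal((n, d))
U = rng.standard_normal((d, k)) * 0.1
V = rng.standard_normal((d, k)) * 0.1
P = rng.standard_normal((k, h)) * 0.1
print(lama_attention(H, U, V, P).shape)   # (3, 16)
```

Under this factorization, the attention parameters scale as O(dk + kh) rather than O(d²), which is where the "compact" saving in this sketch comes from.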
Source: Low Rank Factorization for Compact Multi-Head Self-Attention
Task | Papers | Share |
---|---|---|
Language Modelling | 11 | 20.37% |
Image Inpainting | 4 | 7.41% |
In-Context Learning | 3 | 5.56% |
Question Answering | 3 | 5.56% |
Text Classification | 3 | 5.56% |
Meta-Learning | 2 | 3.70% |
Retrieval | 2 | 3.70% |
Sentiment Analysis | 2 | 3.70% |
Decoder | 1 | 1.85% |