Attention Dropout is a type of dropout used in attention-based architectures, where elements are randomly dropped from the softmax output in the attention equation. For example, for scaled dot-product attention, we would drop elements from the first term (the attention weights) before they multiply $V$:
$$ {\text{Attention}}(Q, K, V) = \text{softmax}\left(\frac{QK^{T}}{\sqrt{d_k}}\right)V $$
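As a concrete illustration, below is a minimal PyTorch sketch of scaled dot-product attention with dropout applied to the softmax output. The function name, tensor shapes, and dropout rate are illustrative assumptions, not taken from any specific implementation.

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention_with_dropout(q, k, v, dropout_p=0.1, training=True):
    """Scaled dot-product attention with dropout on the attention weights."""
    d_k = q.size(-1)
    # Attention scores: QK^T / sqrt(d_k)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)
    # Softmax over the key dimension gives the attention weights
    weights = F.softmax(scores, dim=-1)
    # Attention Dropout: randomly zero entries of the softmax output
    weights = F.dropout(weights, p=dropout_p, training=training)
    # Weighted sum of the values
    return weights @ v

# Example usage with hypothetical shapes: (batch=2, heads=4, seq_len=8, d_k=16)
q = torch.randn(2, 4, 8, 16)
k = torch.randn(2, 4, 8, 16)
v = torch.randn(2, 4, 8, 16)
out = scaled_dot_product_attention_with_dropout(q, k, v, dropout_p=0.1)
print(out.shape)  # torch.Size([2, 4, 8, 16])
```

Note that dropping entries of the attention weights (rather than the inputs or outputs) means the model cannot rely on any single key-value pair, which is the intended regularization effect.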
| Task | Papers | Share |
|---|---|---|
| RAG | 186 | 16.53% |
| Retrieval | 143 | 12.71% |
| Question Answering | 59 | 5.24% |
| Language Modelling | 48 | 4.27% |
| Language Modeling | 45 | 4.00% |
| Large Language Model | 42 | 3.73% |
| Sentiment Analysis | 21 | 1.87% |
| Text Classification | 20 | 1.78% |
| Text Generation | 19 | 1.69% |