Attention Modules

General • Attention • 42 methods

Attention modules are network components that incorporate one or more attention mechanisms. For example, multi-head attention runs several scaled dot-product attention heads in parallel and concatenates their outputs before a final linear projection. Below you can find a continuously updating list of attention modules.
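
To make the multi-head example concrete, here is a minimal sketch of a multi-head self-attention module in PyTorch. It assumes unmasked self-attention over a single input sequence and omits dropout; the class and parameter names (`MultiHeadAttention`, `d_model`, `num_heads`) are illustrative and not tied to any specific paper's implementation. PyTorch also ships a production-ready `torch.nn.MultiheadAttention` module.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadAttention(nn.Module):
    """Minimal multi-head scaled dot-product self-attention (illustrative sketch)."""

    def __init__(self, d_model: int, num_heads: int):
        super().__init__()
        assert d_model % num_heads == 0, "d_model must be divisible by num_heads"
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        # Learned projections for queries, keys, values, and the output.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        b, t, _ = x.shape

        def split_heads(proj: torch.Tensor) -> torch.Tensor:
            # Reshape to (batch, num_heads, seq_len, d_head).
            return proj.view(b, t, self.num_heads, self.d_head).transpose(1, 2)

        q = split_heads(self.q_proj(x))
        k = split_heads(self.k_proj(x))
        v = split_heads(self.v_proj(x))

        # Scaled dot-product attention, computed independently per head.
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5   # (b, h, t, t)
        weights = F.softmax(scores, dim=-1)
        context = weights @ v                                   # (b, h, t, d_head)

        # Concatenate heads and apply the final linear projection.
        context = context.transpose(1, 2).reshape(b, t, self.num_heads * self.d_head)
        return self.out_proj(context)


# Usage: a batch of 2 sequences of length 5 with model dimension 16 and 4 heads.
x = torch.randn(2, 5, 16)
attn = MultiHeadAttention(d_model=16, num_heads=4)
print(attn(x).shape)  # torch.Size([2, 5, 16])
```

Splitting the model dimension across heads keeps the total cost comparable to single-head attention while letting each head attend to different positions or feature subspaces.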
