Feedforward Networks

Position-Wise Feed-Forward Layer

Introduced by Vaswani et al. in Attention Is All You Need

The Position-Wise Feed-Forward Layer is a feedforward layer consisting of two dense layers applied to the last dimension of the input. The same pair of dense layers is applied independently at every position in the sequence, hence the name position-wise.

Source: Attention Is All You Need
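Following the paper's formulation FFN(x) = max(0, xW1 + b1)W2 + b2, a minimal NumPy sketch of the layer is shown below. The weight shapes and the d_model/d_ff sizes are illustrative; the paper uses d_model = 512 and d_ff = 2048.

```python
import numpy as np

def position_wise_ffn(x, W1, b1, W2, b2):
    """FFN(x) = max(0, x W1 + b1) W2 + b2, applied over the last dimension.

    Because matmul acts only on the last axis, every position in the
    sequence passes through the same two dense layers.
    """
    h = np.maximum(0.0, x @ W1 + b1)  # first dense layer + ReLU
    return h @ W2 + b2                # second dense layer

rng = np.random.default_rng(0)
d_model, d_ff = 8, 32                 # toy sizes; the paper uses 512 and 2048
W1 = rng.standard_normal((d_model, d_ff)) * 0.1
b1 = np.zeros(d_ff)
W2 = rng.standard_normal((d_ff, d_model)) * 0.1
b2 = np.zeros(d_model)

x = rng.standard_normal((2, 5, d_model))  # (batch, seq_len, d_model)
y = position_wise_ffn(x, W1, b1, W2, b2)
print(y.shape)  # output keeps the input shape: (2, 5, 8)
```

Applying the function to a single position vector gives the same result as the corresponding slice of the batched output, which is exactly the position-wise property.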

Tasks


Task Papers Share
Language Modelling 47 6.48%
Semantic Segmentation 28 3.86%
Large Language Model 21 2.90%
Question Answering 20 2.76%
Object Detection 18 2.48%
In-Context Learning 15 2.07%
Retrieval 13 1.79%
Image Classification 12 1.66%
Denoising 12 1.66%