no code implementations • 30 Oct 2018 • Guoqiang Zhong, Guohua Yue, Xiao Ling
In this paper, we propose an RNN model, called Recurrent Attention Unit (RAU), which seamlessly integrates the attention mechanism into the interior of the GRU cell by adding an attention gate.
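The abstract does not spell out the RAU equations, but the idea of an extra attention gate inside a GRU cell can be sketched as follows. This is a minimal illustration, not the paper's implementation: the gate name `a`, its weight matrix `Wa`, and the assumption that the attention gate is computed like the other GRU gates and reweights the candidate state are all hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RAUCell:
    """Sketch of a GRU cell extended with an attention gate.

    Hypothetical formulation: the attention gate `a` is computed from
    the same input/hidden concatenation as the standard GRU gates and
    elementwise-reweights the candidate state before the usual blend.
    """

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        shape = (hidden_size, input_size + hidden_size)
        self.Wz = rng.standard_normal(shape) * 0.1  # update gate weights
        self.Wr = rng.standard_normal(shape) * 0.1  # reset gate weights
        self.Wh = rng.standard_normal(shape) * 0.1  # candidate weights
        self.Wa = rng.standard_normal(shape) * 0.1  # attention gate weights (assumed)
        self.hidden_size = hidden_size

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh)                   # update gate
        r = sigmoid(self.Wr @ xh)                   # reset gate
        a = sigmoid(self.Wa @ xh)                   # attention gate (assumed form)
        h_tilde = np.tanh(self.Wh @ np.concatenate([x, r * h]))
        # attention gate modulates the candidate before the GRU interpolation
        return (1 - z) * h + z * (a * h_tilde)

cell = RAUCell(input_size=3, hidden_size=4)
h = np.zeros(4)
for x in np.ones((5, 3)):          # run five timesteps of a toy sequence
    h = cell.step(x, h)
```

Setting `a` to all-ones recovers the standard GRU update, which is one way such a gate can "seamlessly" extend the cell.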