no code implementations • 9 May 2019 • Naifan Zhuang, Guo-Jun Qi, The Duc Kieu, Kien A. Hua
The Long Short-Term Memory (LSTM) recurrent neural network can process complex sequential information because it uses gating mechanisms to learn representations from long input sequences.
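The gating scheme mentioned above can be illustrated with a minimal single-step LSTM cell in NumPy. This is a generic textbook formulation, not the specific architecture of the listed paper; the parameter names (`W`, `U`, `b`) and dimensions are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b stack the parameters for the
    input (i), forget (f), and output (o) gates and the candidate (g)."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b      # pre-activations, shape (4*H,)
    i = sigmoid(z[0:H])             # input gate: how much new content to write
    f = sigmoid(z[H:2*H])           # forget gate: how much old memory to keep
    o = sigmoid(z[2*H:3*H])         # output gate: how much memory to expose
    g = np.tanh(z[3*H:4*H])         # candidate cell content
    c = f * c_prev + i * g          # updated cell (memory) state
    h = o * np.tanh(c)              # updated hidden state
    return h, c

# usage: run a short random sequence through the cell
rng = np.random.default_rng(0)
D, H = 3, 4                          # input and hidden sizes (arbitrary)
W = rng.standard_normal((4 * H, D)) * 0.1
U = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(5):
    h, c = lstm_step(rng.standard_normal(D), h, c, W, U, b)
```

The multiplicative gates are what let gradients flow across many time steps, which is why the LSTM handles long input sequences better than a plain RNN.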
no code implementations • 30 May 2018 • Kevin Joslyn, Naifan Zhuang, Kien A. Hua
Music generation research has grown in popularity over the past decade, thanks to the deep learning revolution that has redefined the landscape of artificial intelligence.
no code implementations • 11 Apr 2018 • Naifan Zhuang, The Duc Kieu, Guo-Jun Qi, Kien A. Hua
The proposed model progressively builds up the ability of the LSTM gates to detect salient dynamical patterns in deeper stacked layers modeling higher orders of DoS, and thus the proposed LSTM model is termed deep differential Recurrent Neural Network (d2RNN).
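The "higher orders of DoS" idea can be sketched with repeated differencing of a state sequence. This is only a discrete-time analogy for intuition, not the paper's actual d2RNN formulation; the toy state trajectory is invented for illustration.

```python
import numpy as np

# Toy state trajectory of shape (T, H) = (6, 1): a sequence whose
# first difference grows linearly, so its second difference is constant.
states = np.cumsum(np.arange(6.0)).reshape(-1, 1)

dos1 = np.diff(states, n=1, axis=0)  # first-order DoS ("velocity" of the states)
dos2 = np.diff(states, n=2, axis=0)  # second-order DoS ("acceleration")
```

In this analogy, each deeper stacked layer of the model works with a higher-order difference of the states below it, exposing progressively more abstract dynamical patterns.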
no code implementations • ICCV 2015 • Vivek Veeriah, Naifan Zhuang, Guo-Jun Qi
This change in information gain is quantified by the Derivative of States (DoS), and thus the proposed LSTM model is termed the differential Recurrent Neural Network (dRNN).
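The Derivative of States can be pictured as the discrete-time difference between consecutive internal states, with large differences flagging salient dynamical changes. This is a hedged sketch of the intuition only; the function name and the toy state values are assumptions, not the paper's exact formulation.

```python
import numpy as np

def derivative_of_states(states):
    """First-order Derivative of States (DoS), approximated in discrete
    time as s_t - s_{t-1}. `states` has shape (T, H); returns (T-1, H)."""
    return np.diff(states, axis=0)

# usage: a slowly varying state followed by an abrupt change
states = np.array([[0.1, 0.1],
                   [0.1, 0.2],
                   [0.9, 0.8]])
dos = derivative_of_states(states)

# large-magnitude DoS rows indicate salient changes in information gain
salience = np.linalg.norm(dos, axis=1)
```

The abrupt jump between the second and third states yields a much larger DoS magnitude than the earlier quiet step, which is the signal the dRNN's gates are meant to exploit.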