Recurrent Neural Network

Tags: Neural Network

See also extensions of the basic RNN: the LSTM, the GRU (Cho et al. 2014), non-saturating recurrent units (Chandar et al. 2019), second-order RNNs (Goudreau et al. 1994), and multiplicative RNNs (Sutskever, Martens, and Hinton 2011)
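For orientation, here is a minimal NumPy sketch of the vanilla (Elman) RNN update that all of these architectures extend: h_t = tanh(W_h h_{t-1} + W_x x_t + b). The sizes and random weights are hypothetical, chosen only for illustration; this is not a trained model.

```python
import numpy as np

def rnn_step(h_prev, x, W_h, W_x, b):
    """One step of a vanilla RNN: h_t = tanh(W_h h_{t-1} + W_x x_t + b)."""
    return np.tanh(W_h @ h_prev + W_x @ x + b)

# Hypothetical sizes for illustration only.
hidden, inputs = 4, 3
rng = np.random.default_rng(0)
W_h = rng.normal(size=(hidden, hidden)) * 0.1   # recurrent weights
W_x = rng.normal(size=(hidden, inputs)) * 0.1   # input weights
b = np.zeros(hidden)

# Run the recurrence over a length-5 random input sequence.
h = np.zeros(hidden)
for x in rng.normal(size=(5, inputs)):
    h = rnn_step(h, x, W_h, W_x, b)
```

The gated variants (LSTM, GRU) replace this single tanh update with gated updates that make it easier to carry information across many time steps.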

Getting Started

Some basic RNN resources to get you started.

Really early RNN work:

LSTMs:
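As a concrete reference while collecting LSTM resources, here is a minimal NumPy sketch of a single LSTM cell step (input, forget, and output gates plus a candidate update, with all four weight blocks stacked into one matrix). The sizes are hypothetical and the weights random; this is an illustration, not a trained model.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(h_prev, c_prev, x, W, b):
    """One LSTM step. W maps [h_{t-1}; x_t] to the four stacked gate pre-activations."""
    z = W @ np.concatenate([h_prev, x]) + b
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)   # input, forget, output gates
    c = f * c_prev + i * np.tanh(g)                # cell state: gated memory update
    h = o * np.tanh(c)                             # hidden state exposed downstream
    return h, c

# Hypothetical sizes for illustration only.
hidden, inputs = 4, 3
rng = np.random.default_rng(0)
W = rng.normal(size=(4 * hidden, hidden + inputs)) * 0.1
b = np.zeros(4 * hidden)

# Run the cell over a length-5 random input sequence.
h, c = np.zeros(hidden), np.zeros(hidden)
for x in rng.normal(size=(5, inputs)):
    h, c = lstm_step(h, c, x, W, b)
```

The additive cell-state update `c = f * c_prev + i * tanh(g)` is the key design choice: gradients can flow through the cell state without repeatedly passing through a squashing nonlinearity, which is what mitigates the vanishing-gradient problem of the vanilla RNN.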

Some other useful things:

Some ideas that I believe could use some more research:

Chandar, Sarath, Chinnadhurai Sankar, Eugene Vorontsov, Samira Ebrahimi Kahou, and Yoshua Bengio. 2019. “Towards Non-saturating Recurrent Units for Modelling Long-term Dependencies.” In AAAI. https://arxiv.org/abs/1902.06704.
Cho, Kyunghyun, Bart van Merriënboer, Dzmitry Bahdanau, and Yoshua Bengio. 2014. “On the Properties of Neural Machine Translation: Encoder–Decoder Approaches.” In Proceedings of SSST-8, Eighth Workshop on Syntax, Semantics and Structure in Statistical Translation. Doha, Qatar: Association for Computational Linguistics.
Goudreau, M.W., C.L. Giles, S.T. Chakradhar, and D. Chen. 1994. “First-Order versus Second-Order Single-Layer Recurrent Neural Networks.” IEEE Transactions on Neural Networks.
Sutskever, Ilya, James Martens, and Geoffrey Hinton. 2011. “Generating Text with Recurrent Neural Networks.” In Proceedings of the 28th International Conference on International Conference on Machine Learning. ICML’11. Bellevue, Washington, USA: Omnipress.