Learning Longer Memory in Recurrent Networks
Tue, May 12, 2015 @ 11:00 AM - 12:00 PM
Information Sciences Institute
Conferences, Lectures, & Seminars
Speaker: Tomas Mikolov, Research Scientist at Facebook AI Research
Talk Title: Learning Longer Memory in Recurrent Networks
Series: AISeminar
Abstract: A recurrent neural network is a powerful model that learns temporal patterns in sequential data. For a long time, it was believed that recurrent networks are difficult to train with simple optimizers, such as stochastic gradient descent, due to the so-called vanishing gradient problem. In this talk, I will show that learning longer-term patterns in real data, such as natural language, is perfectly possible using gradient descent. This is achieved by a slight structural modification of the simple recurrent neural network architecture: some of the hidden units are encouraged to change their state slowly by constraining part of the recurrent weight matrix to be close to identity, thus forming a kind of longer-term memory. We evaluate our model in language modeling experiments, where we obtain performance similar to the much more complex Long Short-Term Memory (LSTM) networks. This is joint work with Armand Joulin, Sumit Chopra, Michael Mathieu and Marc'Aurelio Ranzato.
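The core idea in the abstract — constraining part of the recurrent weight matrix to be close to identity so that some hidden units change slowly — can be illustrated with a short NumPy sketch. This is not the speaker's implementation; the dimensions, the leak rate alpha, and the weight layout are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: a simple RNN augmented with "slow" context units
# whose recurrence is fixed to (1 - alpha) * identity, so their state
# decays slowly and acts as a longer-term memory. All sizes are assumed.
rng = np.random.default_rng(0)

d_in, d_hid, d_ctx = 8, 16, 4
alpha = 0.05  # small update rate for the slow units (assumed value)

A = rng.normal(0, 0.1, (d_hid, d_in))   # input -> hidden weights
R = rng.normal(0, 0.1, (d_hid, d_hid))  # hidden -> hidden (fully trained)
B = rng.normal(0, 0.1, (d_ctx, d_in))   # input -> context weights
P = rng.normal(0, 0.1, (d_hid, d_ctx))  # context -> hidden weights

def step(x, h, s):
    # Slow units: recurrent weights constrained near identity, so the
    # previous state is mostly kept and only a small update is mixed in.
    s = (1 - alpha) * s + alpha * (B @ x)
    # Fast units: a standard simple-RNN update, also fed by the context.
    h = np.tanh(A @ x + R @ h + P @ s)
    return h, s

h = np.zeros(d_hid)
s = np.zeros(d_ctx)
for _ in range(20):
    x = rng.normal(size=d_in)
    h, s = step(x, h, s)
```

Because the slow state `s` is an exponentially decaying average of past inputs, gradients through it do not vanish as quickly as through the fully trained recurrent matrix, which is the intuition behind the longer-term memory claimed in the talk.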
Biography: Tomas Mikolov is a research scientist at Facebook AI Research. His work includes the introduction of recurrent neural networks to statistical language modeling (published as the open-source RNNLM toolkit) and an efficient algorithm for estimating word representations in continuous space (the Word2vec project). His current interest is in developing techniques and datasets that help advance research toward artificial intelligence systems capable of natural communication with people.
Website: https://research.facebook.com/researchers/643234929129233/tomas-mikolov/
Host: Ashish Vaswani
Location: Information Science Institute (ISI) - 1135
WebCast Link: http://webcasterms1.isi.edu/mediasite/Viewer/?peid=19140806ae5e4116ab2644b1c1d86bbe1d
Audiences: Everyone Is Invited