Fri, Jan 19, 2018 @ 03:00 PM - 04:00 PM
Information Sciences Institute
Conferences, Lectures, & Seminars
Speaker: Ashish Vaswani, Jakob Uszkoreit, and Niki Parmar, Google Brain
Talk Title: Attention Is All You Need
Series: Natural Language Seminar
Abstract: The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration. The best performing models also connect the encoder and decoder through an attention mechanism. We propose a new simple network architecture, the Transformer, based solely on attention mechanisms, dispensing with recurrence and convolutions entirely. Experiments on two machine translation tasks show these models to be superior in quality while being more parallelizable and requiring significantly less time to train. Our model achieves 28.4 BLEU on the WMT 2014 English-to-German translation task, improving over the existing best results, including ensembles, by over 2 BLEU. On the WMT 2014 English-to-French translation task, our model establishes a new single-model state-of-the-art BLEU score of 41.8 after training for 3.5 days on eight GPUs, a small fraction of the training costs of the best models from the literature. We show that the Transformer generalizes well to other tasks by applying it successfully to English constituency parsing both with large and limited training data.
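The core operation of the architecture described above is scaled dot-product attention, in which each query position forms a weighted average over value vectors, with weights given by a softmax over query-key similarities. Below is a minimal NumPy sketch of that operation (shapes and the toy inputs are illustrative assumptions, not the paper's full multi-head implementation):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value matrices."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)          # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)        # row-wise softmax
    return weights @ V                                    # weighted sum of values

# Toy example: 3 query positions attending over 4 key/value positions.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8): one output vector per query position
```

Because every query attends to every key in a single matrix multiplication, the computation parallelizes across positions, which is the source of the training-time advantage over recurrent models noted in the abstract.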
Biography: Niki Parmar is currently a Research Engineer at Google Brain, where she works on generative modeling for tasks across different modalities, such as machine translation, conditional image generation, and super-resolution. Before joining Brain, she worked within Google Research to experiment with and evaluate models for query similarity and question answering used within Search. Niki received her Master's in Computer Science from USC before joining Google.
Jakob Uszkoreit is currently a member of the Google Brain research team. There, he works on neural networks generating text, images and other modalities in tasks such as machine translation or image super-resolution. Before joining Brain, Jakob led teams in Google Research and Search, developing neural network models of language that learn from weak supervision at very large scale and designing the semantic parser of the Google Assistant. Prior to that, he worked on Google Translate in its early years. Jakob received his MSc in Computer Science and Mathematics from the Technical University of Berlin in 2008.
Ashish Vaswani is a Research Scientist at Google Brain, where he works with fun people on non-sequential generative models that seem to translate well and generate reasonable images of cars and faces. He's also interested in non-autoregressive models for generating structured outputs. Before Brain, he spent many wonderful years at ISI, first as a PhD student, working on fast training of neural language models and MDL-inspired training of latent-variable models with David Chiang and Liang Huang, and later as a scientist. He misses his colleagues in LA, but he prefers the weather in San Francisco.
Hosts: Marjan Ghazvininejad and Kevin Knight
More Info: http://nlg.isi.edu/nl-seminar/
Audiences: Everyone Is Invited
Contact: Peter Zamar