Events for January 19, 2018
-
Transfer Day
Fri, Jan 19, 2018
Viterbi School of Engineering Undergraduate Admission
Receptions & Special Events
Transfer Day features: a presentation from Viterbi Admission, campus tours, academic department visits, and more!
If you have questions about engineering and the transfer process, then Transfer Day is for you. Transfer Day is a comprehensive half-day program designed to give you the most in-depth look at the transfer process and academic life at USC. Specifically, the program includes presentations on the admission process, transfer credit policy, academics, and financial aid. You will also have the opportunity to visit an academic department or take a campus tour. Reservations are required.
Location: Ronald Tutor Campus Center (TCC) - USC Admission Office
Audiences: Everyone Is Invited
Contact: Viterbi Admission
-
W.V.T. Rusch Engineering Honors Colloquium
Fri, Jan 19, 2018 @ 01:00 PM - 01:50 PM
USC Viterbi School of Engineering
Conferences, Lectures, & Seminars
Speaker: Michael Affeldt, Director of LARiverWorks, Office of Los Angeles Mayor Eric Garcetti
Talk Title: Los Angeles River Revitalization Program
Abstract:
Host: Dr. Prata & EHP
Location: Henry Salvatori Computer Science Center (SAL) - 101
Audiences: Everyone Is Invited
Contact: Su Stevens
-
NL Seminar - Attention Is All You Need
Fri, Jan 19, 2018 @ 03:00 PM - 04:00 PM
Information Sciences Institute
Conferences, Lectures, & Seminars
Speaker: Ashish Vaswani, Jakob Uszkoreit, and Niki Parmar, Google Brain
Talk Title: Attention Is All You Need
Series: Natural Language Seminar
Abstract: The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration. The best performing models also connect the encoder and decoder through an attention mechanism. We propose a new simple network architecture, the Transformer, based solely on attention mechanisms, dispensing with recurrence and convolutions entirely. Experiments on two machine translation tasks show these models to be superior in quality while being more parallelizable and requiring significantly less time to train. Our model achieves 28.4 BLEU on the WMT 2014 English-to-German translation task, improving over the existing best results, including ensembles, by over 2 BLEU. On the WMT 2014 English-to-French translation task, our model establishes a new single-model state-of-the-art BLEU score of 41.8 after training for 3.5 days on eight GPUs, a small fraction of the training costs of the best models from the literature. We show that the Transformer generalizes well to other tasks by applying it successfully to English constituency parsing, both with large and limited training data.
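For context, the operation the talk's title refers to, scaled dot-product attention, is compact enough to sketch directly from the paper's formula, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The sketch below is a minimal single-head NumPy illustration of that formula, not the authors' implementation; the function name and toy shapes are assumptions for illustration only.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, per the paper's formula.
        # Q, K: (seq_len, d_k); V: (seq_len, d_v).
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                 # query-key compatibility, scaled
        scores -= scores.max(axis=-1, keepdims=True)    # stabilize the softmax numerically
        weights = np.exp(scores)
        weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax over keys
        return weights @ V                              # weighted sum of value vectors

    # Toy usage (hypothetical shapes): 4 positions, d_k = d_v = 8.
    rng = np.random.default_rng(0)
    Q, K, V = (rng.standard_normal((4, 8)) for _ in range(3))
    out = scaled_dot_product_attention(Q, K, V)         # shape: (4, 8)

Because each output position attends to all positions at once, the computation is a pair of matrix multiplications rather than a sequential recurrence, which is the source of the parallelism the abstract highlights.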
Biography: Niki Parmar is currently a Research Engineer at Google Brain, where she works on generative modeling for tasks across different modalities, such as machine translation, conditional image generation, and super-resolution. Before joining Brain, she worked within Google Research to experiment with and evaluate models for query similarity and question answering used within Search. Niki received her Master's in Computer Science from USC before joining Google.
Jakob Uszkoreit is currently a member of the Google Brain research team. There, he works on neural networks generating text, images and other modalities in tasks such as machine translation or image super-resolution. Before joining Brain, Jakob led teams in Google Research and Search, developing neural network models of language that learn from weak supervision at very large scale and designing the semantic parser of the Google Assistant. Prior to that, he worked on Google Translate in its early years. Jakob received his MSc in Computer Science and Mathematics from the Technical University of Berlin in 2008.
Ashish Vaswani is a Research Scientist at Google Brain, where he works with fun people on non-sequential generative models that seem to translate well and generate reasonable images of cars and faces. He's also interested in non-autoregressive models for generating structured outputs. Before Brain, he spent many wonderful years at ISI, first as a PhD student working on fast training of neural language models and MDL-inspired training of latent-variable models with David Chiang and Liang Huang, and later as a scientist. He misses his colleagues in LA, but he prefers the weather in San Francisco.
Host: Marjan Ghazvininejad and Kevin Knight
More Info: http://nlg.isi.edu/nl-seminar/
Location: Information Sciences Institute (ISI) - 11th Floor Conference Rooms #1135 and #1137, Marina del Rey
Audiences: Everyone Is Invited
Contact: Peter Zamar
Event Link: http://nlg.isi.edu/nl-seminar/