NL Seminar - On Exploiting Context Usage in Document-Level Neural Machine Translation
Thu, Jul 28, 2022 @ 11:00 AM - 12:00 PM
Information Sciences Institute
Conferences, Lectures, & Seminars
Speaker: Jacqueline He, USC/ISI Intern
Talk Title: On Exploiting Context Usage in Document-Level Neural Machine Translation
Series: NL Seminar
Abstract: This presentation will not be recorded; the seminar will be live only.
REMINDER:
Meeting hosts will only admit guests they know to the Zoom meeting. Hence, you are strongly encouraged to sign into Zoom with your USC account.
If you are an outside visitor, please inform us beforehand at nlg DASH seminar DASH host AT isi DOT edu so we will be aware of your attendance and can let you in.
In-person attendance will be permitted for USC/ISI faculty, staff, and students only. The talk is open to the public virtually via the Zoom registration link.
A crucial limitation of current sentence-level machine translation systems is their inability to account for context. By processing each sentence in isolation, existing neural machine translation (NMT) systems are prone to missing important document-level cues and demonstrate a poor understanding of inter-sentential discourse properties, resulting in a noticeable quality gap between human-translated and machine-translated text. In this talk, we will discuss ongoing efforts to construct NMT models that can effectively harness context. We primarily focus on the popular IWSLT '17 English-to-French translation task and compare against a strong concatenation-based Transformer (Vaswani et al., 2017) baseline.
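As a rough illustration of the concatenation-based baseline mentioned above (a minimal sketch of the general recipe, not the speaker's code), each source sentence can be prefixed with its k preceding sentences, joined by a separator token, before being passed to an ordinary sentence-level Transformer. The separator token name and context window size below are assumptions for illustration only.

# Minimal sketch: preparing inputs for a concatenation-based
# document-level NMT baseline. Each source sentence is prefixed with
# up to k preceding sentences, joined by a separator token, and the
# resulting string is translated by a standard sentence-level model.

from typing import List

SEP = "<sep>"  # hypothetical sentence-boundary marker

def build_context_inputs(doc_sentences: List[str], k: int) -> List[str]:
    """For each sentence, prepend up to k preceding sentences as context."""
    inputs = []
    for i, sent in enumerate(doc_sentences):
        context = doc_sentences[max(0, i - k):i]
        inputs.append(f" {SEP} ".join(context + [sent]))
    return inputs

if __name__ == "__main__":
    doc = [
        "The committee met on Tuesday.",
        "It approved the budget.",
        "The decision was unanimous.",
    ]
    for line in build_context_inputs(doc, k=2):
        print(line)

Increasing k widens the context window but lengthens the input the model must attend over, which is one reason the gains tend to diminish as more context is added.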
First, we corroborate existing findings (Fernandes et al., 2021) that increasing context can improve translation performance, though with diminishing returns. We hypothesize that the Transformer's self-attention mechanism may be insufficient for handling long-range dependencies across sentences, both inside and outside of the context window. We then explore replacing the Transformer with a novel neural architecture whose attention layer is based on an exponential moving average, exploiting both local and global context. Finally, we will discuss a chunk-based strategy for encoding and decoding text, and conclude with future directions.
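The exponential-moving-average idea can be sketched as follows (a minimal illustration under stated assumptions, not the architecture presented in the talk): each position's representation becomes a decayed running average of everything before it, giving the model a cheap channel for long-range context alongside local attention. The fixed scalar decay used here is for clarity; in practice the decay is typically a learned, multi-dimensional parameter.

# Minimal sketch: a damped exponential moving average applied along the
# sequence dimension of token representations.

import numpy as np

def ema_layer(x: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """Apply a damped EMA over the sequence.

    x: array of shape (seq_len, d_model)
    returns: same shape, with out[t] = alpha * x[t] + (1 - alpha) * out[t - 1]
    """
    out = np.zeros_like(x)
    out[0] = x[0]
    for t in range(1, x.shape[0]):
        out[t] = alpha * x[t] + (1.0 - alpha) * out[t - 1]
    return out

if __name__ == "__main__":
    seq = np.random.randn(6, 4)       # 6 tokens, 4-dimensional embeddings
    smoothed = ema_layer(seq, alpha=0.3)
    print(smoothed.shape)             # (6, 4)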
Biography: Jacqueline He is a summer intern with the Natural Language Group at USC/ISI, advised by Professors Jonathan May and Xuezhe Ma. She recently graduated from Princeton University with a bachelor's degree in Computer Science. Her current research focuses on context-aware neural machine translation, and she has previously worked on interpretability and ethics in NLP.
Hosts: Jon May and Thamme Gowda
More Info: https://nlg.isi.edu/nl-seminar/
Location: Information Sciences Institute (ISI) - Virtual
Audiences: Everyone Is Invited
Contact: Pete Zamar
Event Link: https://nlg.isi.edu/nl-seminar/