NL Seminar: 1. Leveraging Abstract Meaning Representations to Amplify the Semantic Information Captured in Transformer Models; 2. Improving Multilingual Encoder With Contrastive Objective and Luna
Thu, Aug 19, 2021 @ 11:00 AM - 12:00 PM
Information Sciences Institute
Conferences, Lectures, & Seminars
Speakers: 1. Shira Wein, 2. Leo Zeyu Liu (ISI Interns)
Talk Titles: 1. Leveraging Abstract Meaning Representations to Amplify the Semantic Information Captured in Transformer Models; 2. Improving Multilingual Encoder With Contrastive Objective and Luna
Series: NL Seminar
Abstract: REMINDER: Meeting hosts only admit guests they know to the Zoom meeting, so you are highly encouraged to sign into Zoom with your USC account. If you are an outside visitor, please inform nlg DASH seminar DASH admin2 AT isi.edu beforehand so we will be aware of your attendance and can let you in.
SHIRA WEIN
Though state-of-the-art language models perform well on a variety of natural language processing tasks, these models are not exposed to explicit semantic information. We propose that language models' ability to capture semantic information can be improved by including explicit semantic information in the form of meaning representations, thereby improving performance on select downstream tasks. We discuss potential ways to incorporate meaning representations and present our preliminary results.
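As a concrete illustration (not necessarily the speaker's actual setup), one simple way to expose a transformer to explicit semantics is to linearize an AMR graph and feed it to the encoder alongside the sentence as a second segment. The sketch below assumes a Hugging Face BERT checkpoint and an illustrative PENMAN-linearized AMR.

# Hypothetical sketch: append a linearized AMR graph to the input so the
# encoder conditions on explicit semantics as well as surface text.
from transformers import AutoTokenizer, AutoModel

sentence = "The boy wants to go."
# PENMAN-style AMR for the sentence, linearized to a single string.
amr = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Encode sentence and linearized AMR as a text pair: [CLS] sent [SEP] amr [SEP]
inputs = tokenizer(sentence, amr, return_tensors="pt")
outputs = model(**inputs)

# The [CLS] representation now depends on both the text and the AMR;
# a task-specific head could be fine-tuned on top of it.
cls_embedding = outputs.last_hidden_state[:, 0]
print(cls_embedding.shape)  # torch.Size([1, 768])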
LEO ZEYU LIU
Transformers have been successfully adapted to multilingual pretraining. With only token-level losses such as masked language modeling, a transformer encoder can produce good token and sentence representations. We propose to explicitly impose sentence-level objectives using contrastive learning to further improve the multilingual encoder. Furthermore, we propose to combine this modification with what a new transformer architecture, Luna, can offer: disentanglement between token and sentence representations. We will also discuss ways to evaluate the models and present our experimental progress.
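To make the sentence-level objective concrete, here is a minimal, hypothetical sketch of an InfoNCE-style contrastive loss over sentence embeddings, treating aligned translation pairs in a batch as positives. The function name, batch construction, and temperature value are assumptions for illustration, not the speaker's exact formulation.

import torch
import torch.nn.functional as F

def contrastive_sentence_loss(z_src, z_tgt, temperature=0.05):
    # z_src, z_tgt: [batch, dim] embeddings of aligned translation pairs.
    z_src = F.normalize(z_src, dim=-1)
    z_tgt = F.normalize(z_tgt, dim=-1)
    # Similarity of every source sentence to every target sentence in the batch.
    logits = z_src @ z_tgt.t() / temperature  # [batch, batch]
    # The matching translation sits on the diagonal.
    labels = torch.arange(z_src.size(0), device=z_src.device)
    # Symmetric cross-entropy: source-to-target and target-to-source.
    return (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels)) / 2

# Toy usage with random embeddings standing in for multilingual encoder outputs.
z_src = torch.randn(8, 768, requires_grad=True)
z_tgt = torch.randn(8, 768, requires_grad=True)
loss = contrastive_sentence_loss(z_src, z_tgt)
loss.backward()  # would update the encoder when the embeddings come from it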
Biography: Shira Wein is an intern at ISI and a third-year Ph.D. student at Georgetown University, working on semantic representations and multilingual/cross-lingual applications. Her previous work centers on L2 corpora, Abstract Meaning Representations, and information extraction from design documents, which she published on while interning at the Jet Propulsion Laboratory. Prior to starting her Ph.D., Shira was an undergraduate at Lafayette College, where she received a B.S. in Computer Science and a B.A. in Spanish.
Leo Zeyu Liu is a Master's student in Computer Science at the University of Washington, advised by Noah A. Smith and Shane Steinert-Threlkeld. His research focuses on interpretability, pretraining, and the intersection of NLP and linguistics. He completed his bachelor's degree in Computer Science at the University of Washington.
Hosts: Jon May and Mozhdeh Gheini
More Info: https://nlg.isi.edu/nl-seminar/
Location: Information Sciences Institute (ISI) - Virtual
WebCast Link: https://usc.zoom.us/j/93331739032
Audiences: Everyone Is Invited
Contact: Pete Zamar
Event Link: https://nlg.isi.edu/nl-seminar/