Thu, Oct 27, 2022 @ 11:00 AM - 12:00 PM
Information Sciences Institute
Conferences, Lectures, & Seminars
Speaker: Eric Wallace, UC Berkeley
Talk Title: Emerging Vulnerabilities in Large-scale NLP Models
Series: NL Seminar
Abstract: REMINDER:
Meeting hosts will only admit guests they know to the Zoom meeting, so you are highly encouraged to sign into Zoom with your USC account.
If you are an outside visitor, please inform us beforehand at nlg-seminar-host@isi.edu so we will be aware of your attendance and can let you in.
In-person attendance is permitted for USC/ISI faculty, staff, and students only. The talk is open to the public virtually via the Zoom link.
The current era of machine learning and natural language processing is dominated by scale: modern models use massive parameter counts, dataset sizes, and compute budgets. While this scaling undoubtedly unlocks new capabilities and performance improvements, it may also expose new vulnerabilities, risks, and harms. In this talk, I will discuss a series of vulnerabilities that emerge in large-scale NLP models, ones that not only expose worrisome security and privacy risks but also provide new perspectives on how and why the models work. Concretely, I will show how adversaries can extract private training data, steal model weights, and poison training sets, all using limited black-box access to models. Throughout the talk, I'll place a particular focus on the insights we can derive from these attacks, especially regarding the impact of model scaling.
Biography: Eric Wallace is a fourth-year PhD student at UC Berkeley advised by Dan Klein and Dawn Song. His research focuses on making large language models more robust, trustworthy, secure, and private. Eric's work is supported by the Apple Fellowship in AI/ML, and he received the Best Demo Award at EMNLP 2019.
Host: Jon May and Meryem M'hamdi
More Info: https://nlg.isi.edu/nl-seminar/
WebCast Link: https://www.youtube.com/watch?v=42LNH1dTlgg
Audiences: Everyone Is Invited
Contact: Pete Zamar