University of Southern California

Events Calendar




Events for July: Conferences, Lectures, & Seminars

  • NL Seminar: Advances in Text Generation and the Perils of its Automatic Evaluation

    Thu, Jul 01, 2021 @ 09:00 AM - 10:00 AM

    Information Sciences Institute

    Conferences, Lectures, & Seminars


    Speaker: Kalpesh Krishna (UMass Amherst)

    Talk Title: Advances in Text Generation and the Perils of its Automatic Evaluation

    Series: NL Seminar

    Abstract: Meeting hosts only admit guests they know to the Zoom meeting, so you're highly encouraged to use your USC account to sign in to Zoom. If you're an outside visitor, please inform us at nlg-seminar-admin2@isi.edu beforehand so that we'll be aware of your attendance and can let you in.

    Recent advances in large-scale language modeling have significantly improved the capability of natural language generation (NLG) systems, opening up several new applications. Unfortunately, evaluating NLG systems remains challenging, making it hard to measure meaningful progress. In this talk I will present our recent efforts in building and evaluating NLG systems for (1) unsupervised sentence-level style transfer and (2) paragraph-length abstractive question answering with the ELI5 dataset. We build NLG systems from large language models using paraphrase generation and retrieval, respectively, that significantly outperform the prior state of the art on standard automatic metrics. Unfortunately, we discover several issues with the current evaluation setups: trivial baselines like input copying can game these standard metrics, even outperforming real systems. Along the way I will discuss our efforts toward rectifying these issues, and conclude with a brief mention of other projects working toward more robust NLG evaluation.
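
    To make the metric-gaming point concrete, here is a minimal, self-contained Python sketch (illustrative only, with made-up sentences; not from the talk) using a crude unigram-overlap F1 as a stand-in for lexical metrics such as BLEU. Because a style-transfer input already shares most content words with the human reference, simply copying the input can outscore a genuine transfer output:

        # Hypothetical illustration: lexical-overlap metrics can reward a trivial
        # copy-the-input baseline in style transfer, since the input already
        # shares most content words with the reference.
        from collections import Counter

        def unigram_f1(hypothesis: str, reference: str) -> float:
            """Unigram precision/recall F1, a crude stand-in for BLEU-style metrics."""
            hyp = Counter(hypothesis.lower().split())
            ref = Counter(reference.lower().split())
            overlap = sum((hyp & ref).values())
            if overlap == 0:
                return 0.0
            precision = overlap / sum(hyp.values())
            recall = overlap / sum(ref.values())
            return 2 * precision * recall / (precision + recall)

        source = "thou art the most wretched knave i have ever met"     # archaic input
        reference = "you are the most wretched person i have ever met"  # human modern rewrite
        system = "you are truly an awful person"                        # a real transfer output

        print(f"copy baseline: {unigram_f1(source, reference):.2f}")  # 0.70, no transfer at all
        print(f"real system:   {unigram_f1(system, reference):.2f}")  # 0.38, despite doing the task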


    Biography: Kalpesh is a third-year PhD student at UMass Amherst, advised by Prof. Mohit Iyyer. He is primarily interested in natural language generation and the security of NLP systems. Before coming to UMass, he completed a bachelor's degree at IIT Bombay, advised by Prof. Preethi Jyothi. He has also spent time interning at Google, TTI-Chicago, and Mozilla. His research is supported by a Google PhD Fellowship, awarded in 2021.


    Host: Jon May and Mozhdeh Gheini

    More Info: https://nlg.isi.edu/nl-seminar/

    Webcast: https://youtu.be/bv95xMBZO_U

    Location: Information Sciences Institute (ISI) - Virtual Only

    Audiences: Everyone Is Invited

    Contact: Pete Zamar

    Event Link: https://nlg.isi.edu/nl-seminar/

  • NL Seminar: Predictor-Guided Controlled Generation

    Thu, Jul 22, 2021 @ 11:10 AM - 12:10 PM

    Information Sciences Institute

    Conferences, Lectures, & Seminars


    Speaker: Kevin Yang, UC Berkeley

    Talk Title: Predictor Guided Controlled Generation

    Abstract: REMINDER: Meeting hosts only admit guests they know to the Zoom meeting, so you're highly encouraged to use your USC account to sign in to Zoom. If you're an outside visitor, please inform nlg-seminar-admin2@isi.edu beforehand so that we'll be aware of your attendance and can let you in.

    I will present two works on controlled generation, with a shared theme of using predictors to guide a generator. Future Discriminators for Generation (FUDGE) is a flexible and modular method for controlled text generation: it learns an attribute predictor operating on a partial sequence, and uses this predictor's outputs to adjust a base generator's original probabilities, with no need for re-training or fine-tuning. Switching domains, I will also present Improving Molecular Design by Stochastic Iterative Target Augmentation, a self-training approach for using a strong attribute predictor to guide the training of a generator in a semi-supervised manner. Overall, we find that these predictor-guided approaches to controlled generation substantially outperform prior methods in several text generation tasks, as well as in molecular design and program synthesis.
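
    As a concrete picture of the reweighting rule FUDGE applies (a minimal sketch with toy stand-ins for the models; not the authors' code), each candidate next token's base probability is multiplied by the attribute predictor's probability for the extended partial sequence, then renormalized:

        # Sketch of a FUDGE-style decoding step: p(x_{t+1} | x_{1:t}, a) is
        # proportional to p(x_{t+1} | x_{1:t}) * p(a | x_{1:t+1}).
        # The base LM and attribute predictor below are hypothetical stand-ins.
        import numpy as np

        VOCAB = ["the", "cat", "sat", "happily", "sadly"]

        def base_next_token_logprobs(prefix):
            """Stand-in for a language model's next-token log-probabilities."""
            logits = np.zeros(len(VOCAB))                  # uniform toy distribution
            return logits - np.log(np.exp(logits).sum())

        def attribute_logprob(partial_sequence):
            """Stand-in for log p(attribute | partial sequence), e.g. 'positive tone'."""
            return np.log(0.9) if "happily" in partial_sequence else np.log(0.1)

        def fudge_step(prefix):
            base = base_next_token_logprobs(prefix)
            # Score every candidate continuation with the partial-sequence predictor.
            attr = np.array([attribute_logprob(prefix + [tok]) for tok in VOCAB])
            combined = base + attr
            combined -= np.log(np.exp(combined).sum())     # renormalize
            return np.exp(combined)

        for tok, p in zip(VOCAB, fudge_step(["the", "cat", "sat"])):
            print(f"{tok:8s} {p:.3f}")                     # 'happily' is boosted over the base LM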


    Biography: Kevin Yang is a rising third-year PhD student at UC Berkeley advised by Dan Klein within Berkeley NLP and BAIR. He is broadly interested in AI in the context of language and game-playing, particularly in designing more modular and/or language-controllable agents. He is also interested in neural architectures for structured domains such as chemistry. Previously, Kevin worked with Regina Barzilay during his undergrad and M.Eng. at MIT, on natural language processing and chemistry applications of deep learning, especially graph convolutional networks.

    Host: Jon May and Mozhdeh Gheini

    More Info: https://nlg.isi.edu/nl-seminar/

    Webcast: https://youtu.be/3aT3dNLyzec

    Location: Information Sciences Institute (ISI) - Virtual Only

    Audiences: Everyone Is Invited

    Contact: Pete Zamar

    Event Link: https://nlg.isi.edu/nl-seminar/

  • NL Seminar: Should PTLMs go to school?

    Wed, Jul 28, 2021 @ 11:00 AM - 12:00 PM

    Information Sciences Institute

    Conferences, Lectures, & Seminars


    Speaker: Manuel Ciosici, ISI Boston (Waltham)

    Talk Title: Should PTLMs go to school?

    Series: NL Seminar

    Abstract: REMINDER: Meeting hosts only admit guests they know to the Zoom meeting, so you're highly encouraged to use your USC account to sign in to Zoom. If you're an outside visitor, please inform nlg-seminar-admin2@isi.edu beforehand so that we'll be aware of your attendance and can let you in.

    I will present a couple of research vignettes on the working knowledge inside large pre-trained language models. I will use the vignettes to argue for a new task that measures modern language models' knowledge and ability to learn from textbooks. Unlike machines, humans do not need to read, say, all of Wikipedia in order to learn.

    For humans, reading a textbook or a manual is often enough to provide working knowledge of the book's topic. We propose LEFT, a new task that measures a machine's capacity to learn from the same textbooks that college graduates use to learn about society and history. The task reveals surprising results for current state-of-the-art language models like T5 and GPT-Neo.


    Biography: Manuel Ciosici is a postdoc at ISI Boston (Waltham), working with Ralph Weischedel and Marjorie Friedman on understanding the knowledge inside large language models and putting it to use, for example, in filling in sequences of events. He is also interested in Natural Language Processing for languages other than English and has recently released a large corpus of Danish to support training large language models. Before joining ISI, Manuel received his Ph.D. from Aarhus University in Denmark and was a postdoc at the IT University of Copenhagen.

    Host: Jon May and Mozhdeh Gheini

    More Info: https://nlg.isi.edu/nl-seminar/

    Webcast: https://youtu.be/ZtM7b0ggfvs

    Location: Information Sciences Institute (ISI) - Virtual Only

    Audiences: Everyone Is Invited

    Contact: Pete Zamar

    Event Link: https://nlg.isi.edu/nl-seminar/

  • Guest Speaker Series: "Learning from the Pandemic to Improve Healthcare"

    Thu, Jul 29, 2021 @ 10:00 AM - 12:00 PM

    Daniel J. Epstein Department of Industrial and Systems Engineering

    Conferences, Lectures, & Seminars


    Speaker: Professor Randy Hall, Epstein ISE Dept./Director of CREATE, and Hal Yee, Jr., MD, PhD, Chief Deputy Director, Clinical Affairs, and Chief Medical Officer, Los Angeles County; moderated by Michael Cousineau, DrPH

    Talk Title: How the Pandemic Changed the Healthcare System.

    Abstract: Understanding Pandemic-Driven Demand for Optimizing Patient Flow.

    Host: The Daniel J. Epstein Department of Industrial and Systems Engineering, The Gehr Family Center at the Keck School of Medicine, and the Price School of Public Policy.

    More Info: Register for the event at https://usc.zoom.us/webinar/register/WN_F5NCbL-MQiKgP63JghiVsg

    More Information: Learning from the Pandemic to Improve Healthcare.pdf

    Location: Online/Zoom

    Audiences: Everyone Is Invited

    Contact: Grace Owh

    Event Link: https://usc.zoom.us/webinar/register/WN_F5NCbL-MQiKgP63JghiVsg

  • NL Seminar: Fantastic Continuous-valued Sentence Representations and How to Find Them

    Thu, Jul 29, 2021 @ 11:00 AM - 12:00 PM

    Information Sciences Institute

    Conferences, Lectures, & Seminars


    Speaker: Nishant Subramani, AllenNLP (AI2)

    Talk Title: Fantastic Continuous-valued Sentence Representations and How to Find Them

    Series: NL Seminar

    Abstract: REMINDER: Meeting hosts only admit guests they know to the Zoom meeting, so you're highly encouraged to use your USC account to sign in to Zoom. If you're an outside visitor, please inform nlg-seminar-admin2@isi.edu beforehand so that we'll be aware of your attendance and can let you in.

    Recently, large pretrained language models have seen tremendous success in few-shot and zero-shot learning settings when given appropriate prompting. For these models to excel in this few-shot paradigm, the provided prompts must condition the language model to produce the desired output sequence. We investigate a more general property of this idea: does there exist a continuous, fixed-length vector that can prompt a pretrained and frozen language model to generate any output sentence of interest? To answer this, we develop a language-model-agnostic sentence representation discovery algorithm, which learns a continuous-valued, fixed-length vector for a sequence by adding the vector at various locations in the language model and optimizing it to maximize the likelihood of the sequence. Experiments reveal that for nearly all English sentences (greater than 98 percent, across different genres and corpora), we can find a vector that recovers the sentence of interest exactly, without fine-tuning the underlying language model. In addition, we find that our representations can be learned in real time and are robust to initialization; convergence requires fewer than 20 iterations on average using stochastic gradient descent with Adam.
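
    As a rough illustration of what such a discovery procedure can look like (a sketch assuming GPT-2 via the Hugging Face transformers library as the frozen model; the injection site and hyperparameters are illustrative, not the speaker's exact algorithm), one can optimize a single fixed-length vector, added to the frozen LM's input embeddings, to maximize the likelihood of a target sentence:

        # Illustrative sketch: learn one fixed-length vector z that, added to the
        # input embeddings of a frozen GPT-2, maximizes the likelihood of a target
        # sentence. Assumes the Hugging Face `transformers` library; not the
        # speaker's exact algorithm.
        import torch
        from transformers import GPT2LMHeadModel, GPT2TokenizerFast

        tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
        model = GPT2LMHeadModel.from_pretrained("gpt2").eval()
        for p in model.parameters():                       # the LM stays frozen
            p.requires_grad_(False)

        target = "The quick brown fox jumps over the lazy dog."
        input_ids = tokenizer(target, return_tensors="pt").input_ids

        # One continuous, fixed-length vector, added at every input position.
        z = torch.zeros(model.config.n_embd, requires_grad=True)
        optimizer = torch.optim.Adam([z], lr=0.1)

        for step in range(100):
            embeds = model.transformer.wte(input_ids) + z  # inject the vector
            loss = model(inputs_embeds=embeds, labels=input_ids).loss
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

        # Greedy decoding under the learned vector, as a crude proxy for the
        # exact-recovery check described in the abstract.
        with torch.no_grad():
            ids = input_ids[:, :1]
            for _ in range(input_ids.shape[1] - 1):
                logits = model(inputs_embeds=model.transformer.wte(ids) + z).logits
                ids = torch.cat([ids, logits[:, -1].argmax(-1, keepdim=True)], dim=-1)
        print(tokenizer.decode(ids[0]))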




    Biography: Nishant is a predoctoral young investigator on the AllenNLP team at AI2, working with Matt Peters, and part of the Masakhane community. He is broadly interested in sentence representations, ML for social good, and out-of-distribution generalization research. Previously, Nishant spent time in industry working on document analysis, OCR, fake speech detection, and audio-driven facial animation. He has also worked with Kyunghyun Cho, Sam Bowman, and Doug Downey during his time at NYU and Northwestern, on NLP and causality research.

    Host: Jon May and Mozhdeh Gheini

    More Info: https://nlg.isi.edu/nl-seminar/

    Webcast: https://youtu.be/pCKBSPDenpc

    Location: Information Sciences Institute (ISI) - Virtual Only

    Audiences: Everyone Is Invited

    Contact: Pete Zamar

    Event Link: https://nlg.isi.edu/nl-seminar/
