University of Southern California

Events Calendar






Conferences, Lectures, & Seminars
Events for June

  • Ph.D. Dissertation

    Thu, Jun 10, 2021 @ 10:00 AM - 12:00 PM

    Sonny Astani Department of Civil and Environmental Engineering

    Conferences, Lectures, & Seminars


    Speaker: Xin Wei, Ph.D. Candidate

    Talk Title: Integrating Systems of Desalination and Potable Reuse: Reduced Energy Consumption for Increased Water Supply

    Abstract: Zoom Info:
    https://usc.zoom.us/j/99689313718?pwd=RmM3MU42b0hGZUZHeER4VUxuUzNZdz09
    Meeting ID: 996 8931 3718
    Passcode: 048920


    Location: Zoom Meeting

    Audiences: Everyone Is Invited

    Contact: Evangeline Reyes

  • Ph.D. Dissertation

    Fri, Jun 11, 2021 @ 10:00 AM - 12:00 PM

    Sonny Astani Department of Civil and Environmental Engineering

    Conferences, Lectures, & Seminars


    Speaker: Eyuphan Koc, Ph.D. Candidate

    Talk Title: COMPREHENSIVE ASSESSMENT OF RESILIENCE AND OTHER DIMENSIONS OF ASSET MANAGEMENT IN METROPOLIS-SCALE TRANSPORT SYSTEMS

    Abstract: Please see attached abstract.

    Zoom Info:
    https://usc.zoom.us/j/95397678878?pwd=WU5zYkFWdkJ5WW5SZ1NZQllwdVRNUT09
    Meeting ID: 953 9767 8878
    Passcode: 940645


    More Information: Eyuphan Koc_Abstract.pdf

    Location: Zoom Meeting

    Audiences: Everyone Is Invited

    Contact: Evangeline Reyes

  • NL Seminar PREFIX-TUNING: OPTIMIZING CONTINUOUS PROMPTS FOR GENERATION

    Thu, Jun 17, 2021 @ 11:00 AM - 12:00 PM

    Information Sciences Institute

    Conferences, Lectures, & Seminars


    Speaker: Xiang Lisa Li, Stanford University

    Talk Title: PREFIX-TUNING: OPTIMIZING CONTINUOUS PROMPTS FOR GENERATION

    Series: NL Seminar

    Abstract: Fine-tuning is the de facto way of leveraging large pre-trained language models for downstream tasks. However, fine-tuning modifies all the language model parameters and therefore necessitates storing a full copy for each task. In this paper, we propose prefix-tuning, a lightweight alternative to fine-tuning for natural language generation tasks, which keeps language model parameters frozen and instead optimizes a sequence of continuous task-specific vectors, which we call the prefix. Prefix-tuning draws inspiration from prompting for language models, allowing subsequent tokens to attend to this prefix as if it were virtual tokens. We apply prefix-tuning to GPT-2 for table-to-text generation and to BART for summarization. We show that by modifying only 0.1 percent of the parameters, prefix-tuning obtains comparable performance in the full-data setting, outperforms fine-tuning in low-data settings, and extrapolates better to examples with topics that are unseen during training.

    Biography: Xiang Lisa Li is a first-year PhD student in computer science at Stanford University, advised by Percy Liang and Tatsunori Hashimoto. She works on controllable text generation/decoding and efficient adaptation of pre-trained language models. Lisa is supported by a Stanford Graduate Fellowship and is the recipient of an EMNLP Best Paper award.

    Host: Jon May and Mozhdeh Gheini

    More Info: https://nlg.isi.edu/nl-seminar/

    Webcast: https://youtu.be/TwE2m6Z991s

    Location: Information Sciences Institute (ISI) - Virtual Only


    Audiences: Everyone Is Invited

    Contact: Peter Zamar

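
The parameter-budget claim in the prefix-tuning abstract above can be illustrated with a rough back-of-the-envelope sketch. All numbers below are assumptions (approximate GPT-2 small dimensions and a hypothetical prefix length), not figures taken from the talk or the paper:

```python
# Rough sketch of the trainable-parameter budget in prefix-tuning.
# Dimensions are assumed (approximate GPT-2 small); prefix_len is hypothetical.

n_layers = 12               # transformer layers in GPT-2 small (assumed)
d_model = 768               # hidden size of GPT-2 small (assumed)
prefix_len = 10             # number of continuous prefix vectors (assumed)
total_params = 124_000_000  # approximate GPT-2 small parameter count

# Prefix-tuning freezes every model weight and learns only the prefix:
# one key vector and one value vector per prefix position per layer,
# which subsequent tokens attend to as if they were virtual tokens.
prefix_params = prefix_len * n_layers * 2 * d_model

fraction = prefix_params / total_params
print(f"prefix parameters: {prefix_params:,}")
print(f"trainable fraction: {fraction:.2%}")
```

Under these assumptions the learned prefix is on the order of 0.1 percent of the model's parameters, which is the scale of savings the abstract describes.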