University of Southern California

Events Calendar


Events for July 16, 2020

  • NL Seminar - Towards interactive story generation

    Thu, Jul 16, 2020 @ 11:00 AM - 12:00 PM

    Information Sciences Institute

    Conferences, Lectures, & Seminars

    Speaker: Mohit Iyyer, U Mass (Amherst)

    Talk Title: Towards interactive story generation

    Abstract: Story generation is difficult to computationally formalize and evaluate, and there are many important questions to ask when tackling the problem. What should we consider as the base unit of a story (e.g., a sentence? a paragraph? a chapter?) What kind of data should we use to train these models (novels? short stories? overly simplistic mechanically Turked paragraphs?) Is any model architecture currently capable of producing long-form narratives that have some semblance of coherent discourse structure, such as plot arcs and character development? When evaluating the outputs of our models, can we do better than just asking people to rate the text based on vaguely defined properties such as enjoyability? In this talk, I'll discuss my lab's ongoing work on story generation by introducing a new dataset and evaluation method that we hope will spur progress in this area, and also describing fine-tuning strategies for large-scale Transformers that produce more coherent and stylistically consistent stories. A major bottleneck of these models is their memory and speed inefficiency; as such, I'll conclude by discussing heavily simplified Transformer language models that make training less expensive without sacrificing output quality.

    Biography: Mohit Iyyer is an assistant professor in computer science at the University of Massachusetts Amherst. His research focuses broadly on designing machine learning models for discourse-level language generation (e.g., for story generation and machine translation), and his group also works on tasks involving creative language understanding (e.g., modeling fictional narratives and characters). He is the recipient of best paper awards at NAACL 2016 and 2018 and a best demo award at NeurIPS 2015. He received his PhD in computer science from the University of Maryland, College Park in 2017, advised by Jordan Boyd-Graber and Hal Daumé III, and spent the following year as a researcher at the Allen Institute for Artificial Intelligence.

    Host: Jon May and Emily Sheng

    More Info: https://nlg.isi.edu/nl-seminar/

    Location: Information Sciences Institute (ISI) - Meeting ID: 938 5732 1879 / Password: 073790

    Audiences: Everyone Is Invited

    Contact: Peter Zamar

  • PhD Defense - Haoyu Huang

    Thu, Jul 16, 2020 @ 02:00 PM - 04:00 PM

    Computer Science

    University Calendar

    Ph.D. Defense - Haoyu Huang 7/16 2:00 pm "Nova-LSM: A Distributed, Component-based LSM-tree Data Store"

    Ph.D. Candidate: Haoyu Huang
    Date: Thursday, July 16, 2020
    Time: 2:00 pm - 4:00 pm
    Committee: Shahram Ghandeharizadeh (chair), Murali Annavaram, Jyotirmoy V. Deshmukh
    Title: Nova-LSM: A Distributed, Component-based LSM-tree Data Store
    Zoom: https://usc.zoom.us/j/99943500149
    Google Meet (only if there are issues with Zoom): meet.google.com/ruu-jjiu-fbk

    The cloud challenges many fundamental assumptions of today's monolithic data stores. It offers a diverse choice of servers with alternative forms of processing capability, storage, memory sizes, and networking hardware. It also offers fast networking between servers and racks, such as RDMA. This motivates a component-based architecture that separates storage from processing for a data store. This architecture complements the classical shared-nothing architecture by allowing nodes to share each other's disk bandwidth. This is particularly useful with a skewed pattern of access to data, since a large file can be scattered across many disks instead of stored on one disk.

    This emerging component-based software architecture constitutes the focus of this dissertation. We present design, implementation, and evaluation of Nova-LSM as an example of this architecture. Nova-LSM is a component-based design of LSM-tree using RDMA. Its components implement the following novel concepts. First, they use RDMA to enable nodes of a shared-nothing architecture to share their disk bandwidth and storage. Second, they construct ranges dynamically at runtime to parallelize compaction and boost performance. Third, they scatter blocks of a file across an arbitrary number of disks and use power-of-d to scale. Fourth, the logging component separates availability of log records from their durability. These design decisions provide for an elastic system with well-defined knobs that control its performance and scalability characteristics. We present an implementation of these designs using LevelDB as a starting point. Our evaluation shows Nova-LSM scales and outperforms its monolithic counterpart by several orders of magnitude. This is especially true with workloads that exhibit a skewed pattern of access to data.
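    The third design decision above (scattering a file's blocks across disks and using power-of-d to scale) can be illustrated with the classic power-of-d-choices placement policy: for each block, sample d candidate disks at random and place the block on the least-loaded candidate. The sketch below is illustrative only; the function and variable names are assumptions and do not come from Nova-LSM's codebase.

    ```python
    import random

    def place_block(disk_loads, d=2):
        """Power-of-d choices: sample d candidate disks uniformly at
        random and place the block on the least-loaded of them.
        (Illustrative sketch; not Nova-LSM's actual implementation.)"""
        candidates = random.sample(range(len(disk_loads)), d)
        target = min(candidates, key=lambda i: disk_loads[i])
        disk_loads[target] += 1
        return target

    # Scatter 10,000 blocks of a hypothetical large file across 16 disks.
    random.seed(42)
    loads = [0] * 16
    for _ in range(10_000):
        place_block(loads, d=2)

    # With d=2, the load spread across disks stays far smaller than with
    # purely random (d=1) placement, which is why skewed access patterns
    # benefit from sharing disk bandwidth this way.
    print(max(loads) - min(loads))
    ```

    Even d=2 is known to dramatically tighten the load imbalance relative to placing each block on a single randomly chosen disk, at the cost of one extra load lookup per placement.
    
    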

    WebCast Link: https://usc.zoom.us/j/99943500149

    Audiences: Everyone Is Invited

    Contact: Lizsl De Leon

  • Repeating Event: Virtual First-Year Admission Information Session

    Thu, Jul 16, 2020 @ 04:00 PM - 05:00 PM

    Viterbi School of Engineering Undergraduate Admission

    Workshops & Infosessions

    Our virtual information session is a live presentation from a USC Viterbi admission counselor designed for high school students and their family members to learn more about the USC Viterbi undergraduate experience. Our session will cover an overview of our undergraduate engineering programs, the application process, and more on student life. Guests will be able to ask questions and engage in further discussion toward the end of the session.

    Please register here!

    Audiences: Everyone Is Invited


    Contact: Viterbi Admission