
Events Calendar




Conferences, Lectures, & Seminars
Events for July

  • CS Colloquium: Yuto Nakanishi (GITAI) - Space robot development at GITAI

    Tue, Jul 19, 2022 @ 03:30 PM - 04:50 PM

    Thomas Lord Department of Computer Science

    Conferences, Lectures, & Seminars


    Speaker: Yuto Nakanishi, GITAI

    Talk Title: Space robot development at GITAI

    Series: Computer Science Colloquium

    Abstract: GITAI is a space robotics start-up developing tools to reduce the risk and cost of labor in space. Our robots are capable of autonomous operations including structure assembly, tool handling, and making connections in vacuum. We are working toward a robotic space labor force that could reduce space labor costs 100-fold.
    GITAI is unique among space start-ups in developing all of the robot's mechatronics, electronics, and software in-house to achieve tight integration of the best technologies.
    GITAI will now compete for contracts in space domains where demand for versatile robots already exists. In particular, we are focusing on robots in three fields: (i) robots for commercial space stations, (ii) robots for on-orbit servicing, and (iii) robots for lunar exploration.
    We conducted a highly successful demonstration of our robot, S1, on the International Space Station (ISS) in October 2021, and we are advancing our robotic system capabilities in support of our next in-space technology demonstrations in 2023 and 2025.
    In this talk, we will introduce the challenges of space robot development.

    Yuto Nakanishi will give his talk in person in SGM 101, and the talk will also be hosted over Zoom.

    Register in advance for this webinar at:

    https://usc.zoom.us/webinar/register/WN_EtiOtxcuQmSLXC5y-P8Ysw

    After registering, attendees will receive a confirmation email containing information about joining the webinar.



    Biography: Chief Robotics Officer of GITAI and former Founder & CEO of SCHAFT. After leaving his position as a research associate at the University of Tokyo's Graduate School of Information Science and Technology (JSK Lab), he founded the bipedal-robot startup SCHAFT, which won the DARPA Robotics Challenge Trials in 2013. He sold the company to Google later that year and led the Tokyo bipedal-platform development team under Google X for five years.


    Host: Stefanos Nikolaidis

    Webcast: https://usc.zoom.us/webinar/register/WN_EtiOtxcuQmSLXC5y-P8Ysw

    Location: Seeley G. Mudd Building (SGM) - 101


    Audiences: Everyone Is Invited

    Contact: Department of Computer Science

  • PhD Defense - Yury Zemlyanskiy

    Tue, Jul 19, 2022 @ 03:30 PM - 05:30 PM

    Thomas Lord Department of Computer Science

    Conferences, Lectures, & Seminars


    Speaker: Yury Zemlyanskiy, PhD Candidate

    Talk Title: Parametric and semi-parametric methods for knowledge acquisition from text

    Abstract: Knowledge acquisition, the process of extracting, processing, and storing new information, is critical to any intelligent system. Nonetheless, modern neural networks used in natural language processing (e.g., BERT) typically have no explicit memory component; the knowledge about the world that the models acquire is stored implicitly in their parameters. This proves unreliable and makes the models ill-suited for knowledge-intensive tasks that require reasoning over vast amounts of textual data. My thesis explores alternative parametric and semi-parametric methods to extract and represent knowledge from text. The main hypothesis is that we can improve the performance of modern NLP models by representing acquired knowledge in a dedicated memory that the models access explicitly (a minimal sketch of this memory-access pattern appears after the listings below). The thesis consists of three parts: the first focuses on parametric memory for a pre-defined set of entities; the second explores a semi-parametric approach to representing entity-centric knowledge in a long document or an entire corpus; the last discusses memory for semantic parsing.

    Committee: Leana Golubchik, Fei Sha, Robin Jia, and Meisam Razaviyayn (external).

    Host: CSCI Department

    More Info: https://usc.zoom.us/j/5525470171

    Audiences: Everyone Is Invited

    Contact: Lizsl De Leon

    Event Link: https://usc.zoom.us/j/5525470171

  • NL Seminar - On Exploiting Context Usage in Document-Level Neural Machine Translation

    Thu, Jul 28, 2022 @ 11:00 AM - 12:00 PM

    Information Sciences Institute

    Conferences, Lectures, & Seminars


    Speaker: Jacqueline He, USC/ISI Intern

    Talk Title: On Exploiting Context Usage in Document-Level Neural Machine Translation

    Series: NL Seminar

    Abstract: This presentation will not be recorded; the seminar will be live only.

    REMINDER:
    Meeting hosts will only admit guests they know to the Zoom meeting, so you are strongly encouraged to sign into Zoom with your USC account.

    If you are an outside visitor, please inform us beforehand at nlg-seminar-host (at) isi (dot) edu so we will be aware of your attendance and can let you in.

    In-person attendance is limited to USC/ISI faculty, staff, and students; the talk is open to the public virtually via the Zoom registration link.

    A crucial limitation of current sentence-level machine translation systems is their inability to account for context. By processing each sentence in isolation, existing neural machine translation (NMT) systems are prone to missing important document-level cues and demonstrate a poor understanding of inter-sentential discourse properties, resulting in a noticeable quality difference between human-translated and machine-translated text. In this talk, we will discuss ongoing efforts to construct NMT models that can effectively harness context. We primarily focus on the popular IWSLT '17 English-French translation task and compare against a strong concatenation-based Transformer (Vaswani et al., 2017) baseline.

    First, we corroborate existing findings (Fernandes et al., 2021) that increasing context can improve translation performance, though with diminishing returns. We hypothesize that the Transformer's self-attention mechanism may be insufficient for handling long-range dependencies across sentences, both inside and outside the context window. We then explore replacing the Transformer with a novel neural architecture whose attention layer is based on an exponential moving average, exploiting both local and global contexts (a minimal sketch of the moving-average idea appears after the listings below). Finally, we will discuss a chunk-based strategy for encoding and decoding text, and conclude with future directions.

    Biography: Jacqueline He is a summer intern with the Natural Language Group at USC/ISI under Professors Jonathan May and Xuezhe Ma. She recently graduated from Princeton University with a bachelor's degree in Computer Science. Her current research interests center on context-aware neural machine translation, and she has previously worked on interpretability and ethics in NLP.

    Hosts: Jon May and Thamme Gowda

    More Info: https://nlg.isi.edu/nl-seminar/

    Location: Information Sciences Institute (ISI) - Virtual

    Audiences: Everyone Is Invited

    Contact: Pete Zamar

    Event Link: https://nlg.isi.edu/nl-seminar/

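
Note on the PhD defense abstract above: the "dedicated memory" that models access explicitly is commonly realized as a soft key-value lookup, i.e., attention over an external table of entity vectors. The sketch below is a minimal illustration under that assumption; the function names, shapes, and lookup scheme are ours, not the actual models from the thesis.

    import numpy as np

    def softmax(x):
        # Numerically stable softmax over the last axis.
        e = np.exp(x - x.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    def memory_read(query, memory_keys, memory_values):
        # Explicit memory access: score every memory slot against the query,
        # then return a weighted blend of the stored knowledge vectors.
        # query:         (d,)   hidden state of the NLP model
        # memory_keys:   (n, d) one key per stored entity/fact
        # memory_values: (n, d) knowledge vector returned for each entry
        scores = memory_keys @ query     # (n,) similarity per memory slot
        weights = softmax(scores)        # soft selection over the slots
        return weights @ memory_values   # (d,) retrieved knowledge vector

    # Toy usage: a memory of 4 entities with 8-dimensional embeddings.
    rng = np.random.default_rng(0)
    q = rng.normal(size=8)
    K = rng.normal(size=(4, 8))
    V = rng.normal(size=(4, 8))
    print(memory_read(q, K, V).shape)  # (8,)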
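Note on the NL Seminar abstract above: an attention layer "based on an exponential moving average" can be pictured, at its core, as a per-dimension EMA running along the token sequence, so every position carries a decayed summary of everything before it. The sketch below illustrates only that core operation under our own assumptions; it is not the specific architecture presented in the talk.

    import numpy as np

    def ema_mix(x, alpha):
        # Exponential moving average along the time axis.
        # x:     (seq_len, d) token representations for a multi-sentence input
        # alpha: (d,) decay in (0, 1); small values remember distant context,
        #        large values focus on the local neighborhood.
        out = np.zeros_like(x)
        state = np.zeros(x.shape[1])
        for t in range(x.shape[0]):
            # Blend the current embedding with the decayed running summary,
            # so cues from earlier sentences still leave a trace here.
            state = alpha * x[t] + (1.0 - alpha) * state
            out[t] = state
        return out

    # Toy usage: 6 tokens, 4 dimensions, a mix of decay rates per dimension.
    rng = np.random.default_rng(1)
    h = rng.normal(size=(6, 4))
    print(ema_mix(h, alpha=np.array([0.9, 0.5, 0.2, 0.05])).shape)  # (6, 4)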