University of Southern California

Events Calendar




University Calendar
Events for January

  • PhD Dissertation Defense - Weizhao Jin

    Thu, Jan 16, 2025 @ 11:00 AM - 12:00 PM

    Thomas Lord Department of Computer Science



    Title: Efficiency in Privacy-Preserving Computation via Domain Knowledge  
     
    Date and Time: Thursday, January 16th, 2025, from 11:00 AM to 12:00 PM  
     
    Location: THH 221
     
    Committee: Srivatsan Ravi (CS), Bhaskar Krishnamachari (ECE), Harsha V. Madhyastha (CS), Fred Morstatter (CS)  
     
    Abstract: In recent years, the growing reliance on user data for building server-side applications and services has significantly heightened the importance of data privacy. To meet expanding privacy regulations such as GDPR, service providers have turned to privacy-preserving methods that maintain computational functionality while protecting user privacy. However, integrating techniques such as homomorphic encryption into application protocols presents a critical challenge: balancing privacy against efficiency. This thesis explores two distinct domains within privacy-preserving computation, machine learning and networks/IoT, offering practical, domain-specific solutions to the overheads and protocol complexity involved. To illustrate how domain-specific insights from federated learning, entity resolution, and computer networking can substantially improve the efficiency of privacy-preserving computation, we make three contributions. First, we introduce a selective encryption strategy for large-scale federated learning models that reduces overhead by encrypting only sensitive parameters while still maintaining robust privacy guarantees (a simplified sketch of this idea follows this listing). Second, we show how homomorphic encryption can be optimized for deep entity resolution via a two-stage computation scheme and novel techniques, including synthetic ranging and polynomial degree optimization, that preserve accuracy under encrypted computation. Finally, we apply Non-Interactive Zero-Knowledge proofs to achieve lightweight privacy-preserving path validation across multi-authority network slices, using a backward pairwise validation procedure to ensure data-forwarding compliance without revealing sensitive topology details. Taken together, these studies highlight how targeting domain-specific challenges with domain-specific knowledge can yield practical, scalable frameworks for efficient privacy-preserving computation.
     
    Zoom Link: https://usc.zoom.us/j/99543392059?pwd=FlQxFqagihzPzEV4tfBaemgHBwOwUM.1  

    Location: Mark Taper Hall Of Humanities (THH) - 221

    Audiences: Everyone Is Invited

    Contact: Weizhao Jin

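    The selective-encryption idea in the first contribution above can be illustrated with a short sketch. Everything here is an assumption made for illustration, not the defended method: the magnitude-based sensitivity mask, the 10% threshold, and toy_encrypt, which stands in for a real additively homomorphic scheme such as Paillier.

        import numpy as np

        def toy_encrypt(x: float) -> str:
            # Placeholder for an additively homomorphic encryption call
            # (e.g., Paillier in a real deployment); returns a fake ciphertext.
            return f"Enc({x:.4f})"

        def selective_encrypt(update: np.ndarray, frac_sensitive: float = 0.1) -> dict:
            # Approximate "sensitive" by update magnitude; the thesis may use a
            # different sensitivity criterion. Only the top fraction is encrypted.
            k = max(1, int(frac_sensitive * update.size))
            sensitive = set(np.argsort(np.abs(update))[-k:].tolist())
            return {i: toy_encrypt(v) if i in sensitive else float(v)
                    for i, v in enumerate(update)}

        # A client's 20-parameter model update: roughly 10% of the entries are
        # encrypted, so the homomorphic-encryption cost drops about tenfold.
        payload = selective_encrypt(np.random.randn(20))
        print(sum(isinstance(v, str) for v in payload.values()), "of", len(payload), "encrypted")

    The efficiency gain comes from the server aggregating most coordinates in plaintext and only the small encrypted slice homomorphically.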
  • PhD Dissertation Defense - Mehrnoosh Mirtaheri

    Wed, Jan 22, 2025 @ 01:00 PM - 03:00 PM

    Thomas Lord Department of Computer Science



    Title: Toward Learning and Forecasting with Temporal Knowledge Graphs
     
    Date and Time: Wednesday, January 22nd, 2025, from 1:00 PM to 3:00 PM
     
    Location: SAL 213
     
    Committee: Aram Galstyan (Chair), Emilio Ferrara (Tenured Faculty), Fred Morstatter, Antonio Ortega (External Faculty)
     
    Abstract: Temporal knowledge graphs (TKGs) model real-world relationships between entities over time, enabling insight extraction from unstructured data. While powerful for various applications, TKGs are inherently limited by incompleteness and noise, making their completion and forecasting crucial research areas.
    This thesis tackles the key challenges in TKG forecasting: relation sparsity in large-scale graphs, continuous integration of new data while preserving existing knowledge, and entity evolution as new entities emerge and existing ones appear in novel contexts. Through novel methodological frameworks, this research demonstrates improved predictive accuracy, robustness to data sparsity, and adaptability to evolving data, validated through extensive evaluation on both standard benchmark and real-world datasets. (A toy quadruple-based forecasting baseline follows this listing.)
     
    Zoom Link: https://usc.zoom.us/j/96220815599

    Location: Henry Salvatori Computer Science Center (SAL) - 213

    Audiences: Everyone Is Invited

    Contact: Mehrnoosh Mirtaheri

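    To make the TKG setting above concrete, here is a deliberately naive forecasting baseline. The (subject, relation, object, timestep) quadruple representation is standard for TKGs; the entities, the decay constant, and the frequency scoring are invented for illustration and are far simpler than the models defended in the thesis.

        from collections import Counter

        # A tiny temporal knowledge graph: (subject, relation, object, timestep).
        tkg = [
            ("usa", "consult", "france", 0),
            ("usa", "consult", "uk", 1),
            ("usa", "consult", "france", 2),
            ("usa", "sanction", "iran", 2),
            ("usa", "consult", "france", 3),
        ]

        def forecast(graph, subject, relation, query_time, decay=0.8):
            # Score candidate objects for (subject, relation, ?) at query_time by
            # exponentially decayed historical frequency: recent facts count more.
            scores = Counter()
            for s, r, o, t in graph:
                if s == subject and r == relation and t < query_time:
                    scores[o] += decay ** (query_time - t)
            return scores.most_common()

        # Who is "usa" most likely to "consult" at timestep 4?
        print(forecast(tkg, "usa", "consult", query_time=4))
        # france, seen both often and recently, outranks uk.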
  • PhD Dissertation Defense - Mozhdeh Gheini

    Fri, Jan 24, 2025 @ 04:00 PM - 05:00 PM

    Thomas Lord Department of Computer Science



    Title: Inductive Biases for Data- and Parameter-Efficient Transfer Learning
     
    Date and Time: Friday, January 24th, 2025, from 10:00 AM to 12:00 PM
     
    Location: Henry Salvatori Computer Science Center (SAL) - 213 and https://usc.zoom.us/j/6564802162
     
    Committee Members: Jonathan May (Chair), Emilio Ferrara, Xuezhe Ma, Khalil Iskarous
     
    Abstract: Data- and resource-intensive pre-training and fine-tuning applied to Transformer-based models is the dominant paradigm at the forefront of rapid advancements in natural language processing, human language technologies, and, most notably, large language models. Such reliance on massive amounts of data, computation, and energy, while effective and impressive from a performance-only perspective, can hinder the open, nonexclusive, and sustainable development of these technologies. In this talk, we present how certain inductive biases can be devised to adapt current natural language methods to resource-constrained scenarios and provide insights into why the proposed inductive biases succeed in such cases.
     
    Specifically, we discuss four research directions on data and parameter efficiency of fine-tuning and transfer learning in natural language processing: (1) a universal regimen that creates a single pre-trained checkpoint suitable for machine translation transfer to practically any language pair and eliminates the need for ad hoc pre-training; (2) an architecture-guided parameter-efficient fine-tuning method that performs competitively with full fine-tuning while exclusively updating cross-attention parameters; (3) an analysis of MEGA, a recently introduced augmentation of the Transformer architecture to incorporate explicit recency bias, through the lens of transfer learning; and (4) a meta-learning algorithm to prime pre-trained models for specific fine-tuning strategies.  
     
    Combined with ablations that show why they are effective and analyses that demonstrate their generalizability, these directions are meant to serve as tools for resource-efficient transfer learning in natural language processing. (A short sketch of the cross-attention-only fine-tuning idea from direction (2) follows this listing.)

    Location: Henry Salvatori Computer Science Center (SAL) - 213

    Audiences: Everyone Is Invited

    Contact: Mozhdeh Gheini

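    Direction (2) in the abstract above, updating only cross-attention parameters, can be sketched against PyTorch's built-in encoder-decoder. The toy nn.Transformer and its "multihead_attn" naming (PyTorch's label for decoder cross-attention) are stand-ins chosen for illustration; the defended method targets a large pre-trained translation model and may differ in detail.

        import torch.nn as nn

        # Toy encoder-decoder; in practice this would be a large pre-trained
        # translation model. Sizes here are arbitrary.
        model = nn.Transformer(d_model=64, nhead=4,
                               num_encoder_layers=2, num_decoder_layers=2)

        # Freeze everything except cross-attention: in nn.Transformer, each
        # decoder layer's cross-attention module is named "multihead_attn",
        # while self-attention is named "self_attn".
        for name, param in model.named_parameters():
            param.requires_grad = "multihead_attn" in name

        trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
        total = sum(p.numel() for p in model.parameters())
        print(f"fine-tuning {trainable}/{total} parameters ({100 * trainable / total:.1f}%)")

    Only the unfrozen slice receives gradient updates, which is what makes this family of methods parameter-efficient relative to full fine-tuning.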