Mon, Mar 20, 2023 @ 11:00 AM - 12:00 PM
Thomas Lord Department of Computer Science
PhD Thesis Defense - Dimitris Stripelis
Heterogeneous Federated Learning
Jose-Luis Ambite (Chair), Cyrus Shahabi, Paul Thompson, Greg Ver Steeg
Data relevant to machine learning problems are distributed across multiple data silos that cannot share their data for regulatory, competitive, or privacy reasons. Federated Learning has emerged as a standard computational paradigm for the distributed training of machine learning and deep learning models across silos. However, the participating silos may have heterogeneous system capabilities and data specifications. In this thesis, we address the challenges in federated learning arising from both computational and semantic heterogeneity. We present federated training policies that accelerate the convergence of the federated model and reduce communication, processing, and energy costs during model aggregation, training, and inference. We show the efficacy of these policies across a wide range of challenging federated environments with highly diverse data distributions, both in benchmark domains and in neuroimaging. We conclude by describing the federated data harmonization problem and presenting a comprehensive federated learning and integration system architecture that addresses the critical challenges of secure and private federated data harmonization, including schema mapping, data normalization, and data imputation.
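For context, the model aggregation step mentioned in the abstract is typically some form of weighted averaging of per-silo model parameters, as in the standard FedAvg rule. The sketch below is a minimal, illustrative version of that baseline rule only; the function name and data are hypothetical, and the thesis proposes training policies that go beyond this simple scheme.

```python
def federated_average(silo_weights, silo_sizes):
    # Classic FedAvg-style aggregation: average each silo's parameter
    # vector, weighted by that silo's local training-set size.
    # A minimal sketch, not the thesis's actual aggregation policies.
    total = sum(silo_sizes)
    dim = len(silo_weights[0])
    return [
        sum(w[i] * n for w, n in zip(silo_weights, silo_sizes)) / total
        for i in range(dim)
    ]

# Three hypothetical silos with different amounts of local data.
weights = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sizes = [10, 30, 60]  # silos contribute proportionally to local data size
global_model = federated_average(weights, sizes)  # [4.0, 5.0]
```

Because no raw data leaves a silo (only model parameters are exchanged), this pattern is what lets the silos described above collaborate despite regulatory and privacy constraints.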
Audience: Everyone Is Invited
Contact: Asiroh Cham