Tue, May 03, 2022 @ 04:00 PM - 05:00 PM PDT
Heterogeneous Federated Learning
Federated Learning has emerged as a standard computational paradigm for distributed training of deep learning models across data silos. The participating silos may have heterogeneous system capabilities and data specifications. In this thesis proposal, we classify these heterogeneities into computational and semantic. We present federated training policies that accelerate the convergence of the federated model by reducing the communication and processing cost required during training. We show the efficacy of these policies across a range of challenging federated environments with highly diverse data distributions. Finally, we introduce the federated data harmonization problem and present a comprehensive architecture that addresses both data harmonization (including schema mapping, data normalization, and data imputation) and federated learning.
José Luis Ambite (advisor, CS)
Cyrus Shahabi (co-advisor, CS)
Greg Ver Steeg (CS)
Meisam Razaviyayn (CS)
Paul Thompson (Keck)
Audience: Everyone Is Invited
Contact: Lizsl De Leon