PhD Thesis Proposal - Basileal Imana
Fri, Sep 30, 2022 @ 09:00 AM - 10:30 AM
Thomas Lord Department of Computer Science
University Calendar
Candidate: Basileal Imana
Title: Platform-supported and Privacy-preserving Auditing of Social Media Algorithms for Public Interest
Committee: Aleksandra Korolova, John Heidemann, Bistra Dilkina, Kristina Lerman, Phebe Vayanos
Abstract:
Social media platforms are at an inflection point due to increasing pressure from society to scrutinize their algorithms and practices. Auditing these platforms in the public interest has become a pressing area of academic research, policy-making and legislation. One area of concern is platform-induced bias in the delivery of ads that shape access to life opportunities such as employment, housing and education. The primary risk comes from platforms' use of relevance-estimator algorithms to personalize ad delivery. In response to these risks, policies are being proposed that would mandate platforms to give external researchers privileged access to perform audits while protecting the platforms' privacy-sensitive data. But, until now, no actionable proposal has been made for how to achieve this policy goal practically and at scale. In this work, we propose a new platform-supported auditing framework that enables a privacy-preserving and generalizable method for evaluating the fairness of the relevance estimators used in the delivery of opportunity ads.
We support this proposal through three studies. In the first study, we perform a black-box audit to show that Facebook's relevance estimator can result in discriminatory delivery of job ads by gender. In our second, ongoing study, we show that the method from the first study generalizes by transforming it into a template that can be applied to audit for discrimination in other areas that have not been well studied. We will use this template to audit for racial bias in the delivery of housing and education ads. The black-box audits in the first two studies will demonstrate how relevance estimators can result in discriminatory outcomes for opportunity ads. Finally, in our third, ongoing study, we propose a platform-supported framework for auditing relevance estimators that addresses the limitations of black-box methods. Our proposal is fundamentally different from prior methods in that it gives auditors query access to the core, proprietary algorithms of social media platforms while preserving the privacy interests of both users and platforms. Taken together, the three studies will show that a platform-supported method for auditing relevance estimators is key to increasing social media transparency.
WebCast Link: https://usc.zoom.us/j/94162038836?pwd=R1Z6QXVyZG54OHRvMUJFSENETklzZz09
Audiences: Everyone Is Invited
Contact: Lizsl De Leon