Center of Autonomy and AI, Center for Cyber-Physical Systems and the Internet of Things, and Ming Hsieh Institute Seminar Series
Wed, Apr 20, 2022 @ 02:00 PM - 03:00 PM
Ming Hsieh Department of Electrical and Computer Engineering
Conferences, Lectures, & Seminars
Speaker: Bichen Wu, Meta (formerly Facebook) Reality Labs
Talk Title: Efficient Deep Learning for Computer Vision
Series: Center for Cyber-Physical Systems and Internet of Things
Abstract: Deep neural networks power an ever-growing range of computer vision applications. To deploy deep learning on more devices (edge, mobile, AR/VR glasses), we need to tackle numerous challenges to make it more efficient. In this talk, we will focus on two important aspects of efficiency: model efficiency and data efficiency. Improving model efficiency makes it possible to pack stronger AI capabilities into devices with limited compute, while better data efficiency unblocks applications constrained by a lack of labeled data. This talk will first introduce the FBNet series of work [1-4], which studies neural architecture search (NAS) methods that automatically develop compute-efficient models with better accuracy-efficiency trade-offs. For data efficiency, this talk will introduce OTTER [5], a data-efficient algorithm that uses language supervision to train vision models for zero-shot image recognition, i.e., recognizing new classes without needing extra labels.
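To give a flavor of the first topic, here is a minimal sketch (assumed for illustration, not the FBNet code) of the core idea behind differentiable NAS: the discrete choice among candidate operations in a layer is relaxed into a softmax-weighted mixture, so the architecture parameters can be trained by gradient descent alongside the model weights. The candidate ops and logits below are hypothetical stand-ins.

```python
import numpy as np

def softmax(a):
    # Numerically stable softmax over architecture logits.
    e = np.exp(a - a.max())
    return e / e.sum()

# Three hypothetical candidate ops for one layer (stand-ins for,
# e.g., a 3x3 conv, a 5x5 conv, and a skip connection).
ops = [lambda x: 2.0 * x, lambda x: x + 1.0, lambda x: x]

alpha = np.array([0.5, -1.0, 0.1])  # learnable architecture logits
w = softmax(alpha)                  # soft selection weights, sum to 1

x = np.array([1.0, 2.0])
# Supernet forward pass: a soft mixture of all candidate ops,
# differentiable with respect to both alpha and the op weights.
y = sum(wi * op(x) for wi, op in zip(w, ops))

# After training, the op with the largest logit is kept and the
# rest are discarded, yielding a discrete architecture.
chosen = int(np.argmax(alpha))
```

In the actual FBNet work the mixture is over real conv blocks and the loss also includes a hardware-aware latency term; this sketch only shows the relaxation trick.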
Papers related to this talk:
[1] FBNetV1: Wu, Bichen, et al. "FBNet: Hardware-Aware Efficient ConvNet Design via Differentiable Neural Architecture Search." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2019.
[2] FBNetV2: Wan, Alvin, et al. "FBNetV2: Differentiable Neural Architecture Search for Spatial and Channel Dimensions." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020.
[3] FBNetV3: Dai, Xiaoliang, et al. "FBNetV3: Joint Architecture-Recipe Search Using Neural Acquisition Function." arXiv preprint arXiv:2006.02049, 2020.
[4] FBNetV5: Wu, Bichen, et al. "FBNetV5: Neural Architecture Search for Multiple Tasks in One Run."
[5] OTTER: Wu, Bichen, et al. "Data Efficient Language-Supervised Zero-Shot Recognition with Optimal Transport Distillation."
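For the second topic, the sketch below (assumed for illustration, not OTTER's actual implementation) shows the basic mechanism of language-supervised zero-shot recognition: class names are embedded as text, and an image is assigned to the class whose text embedding is closest in a shared embedding space. Random vectors stand in for the outputs of trained image and text encoders.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(v):
    # Project embeddings onto the unit sphere so dot products
    # are cosine similarities.
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Hypothetical text embeddings for 4 class names (e.g. prompts
# like "a photo of a cat"), each a unit vector in 64 dimensions.
text_emb = normalize(rng.normal(size=(4, 64)))

# Simulate an image whose true class is index 2: its embedding
# lies near the class-2 text embedding, plus a little noise.
image_emb = normalize(text_emb[2] + 0.05 * rng.normal(size=64))

# Zero-shot prediction: cosine similarity against every class
# text; no labeled images of these classes are needed.
scores = text_emb @ image_emb
pred = int(np.argmax(scores))
```

OTTER's contribution is in how such image/text encoders are trained data-efficiently (via optimal transport distillation); this sketch only illustrates why a shared embedding space enables recognizing new classes without extra labels.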
Biography: Dr. Bichen Wu is a research scientist at Meta (formerly Facebook) Reality Labs. His research focuses on efficient deep learning algorithms, models, and systems, aiming to bring AI capabilities to massive numbers of edge devices and applications. His paper on neural architecture search, FBNet, is among the top 0.01% most-cited computer science papers published in 2019. He obtained his Ph.D. from Berkeley AI Research at UC Berkeley and his Bachelor of Engineering from Tsinghua University in 2013.
Host: Pierluigi Nuzzo, nuzzo@usc.edu
Webcast: https://usc.zoom.us/webinar/register/WN_zyIBh_1gQLmKpMJG0GyLxw
Location: Online
Audiences: Everyone Is Invited
Contact: Talyia White