Events for the 3rd week of May
-
MoBI Seminar: Quanzheng Li
Mon, May 15, 2023 @ 11:00 AM - 12:00 PM
Ming Hsieh Department of Electrical and Computer Engineering
Conferences, Lectures, & Seminars
Speaker: Dr. Quanzheng Li, Associate Professor of Radiology | Massachusetts General Hospital | Harvard Medical School
Talk Title: Artificial intelligence for healthcare using multimodality medical data
Series: MoBI Seminar Series
Abstract: The development of Artificial Intelligence (AI) in healthcare is currently at a critical moment, with tremendous potential for the future but an uncertain trajectory, given the rapid development over the past five years and the emergence of foundation models with astonishing capabilities. In this talk, I will discuss our recent work on using deep neural networks for PET and MR image reconstruction and denoising. Additionally, I will demonstrate how we can leverage deep learning for clinical applications using various multimodality medical data, including imaging, waveforms, electronic health records (EHRs), video, and pathology. Furthermore, I will present some of our preliminary results on the medical application of foundation models (such as GPT and SAM) and discuss potential opportunities and challenges of AI in healthcare.
Biography: Quanzheng Li is an Associate Professor of Radiology at Massachusetts General Hospital (MGH), Harvard Medical School. Dr. Li is also the Senior Director for Research and Development, Data Science Office, Massachusetts General Brigham, and the Director of the Center for Advanced Medical Computing and Analysis, Massachusetts General Hospital. He received his Ph.D. degree in Electrical Engineering from the University of Southern California (USC) in 2005 and completed his postdoctoral training at USC with Richard Leahy. Dr. Li is the recipient of the 2015 IEEE NPSS Early Achievement Award. He is an associate editor of IEEE Transactions on Image Processing, IEEE TMI, and Medical Physics, and a member of the editorial boards of Theranostics and Physics in Medicine and Biology. Dr. Li has more than 200 peer-reviewed articles, and his team won the AAPM-NIH Low Dose CT Challenge and the 2018 Camelyon Challenge on digital pathology. His research interests include image reconstruction for PET, SPECT, CT, and MRI, and multimodality medical data analysis using artificial intelligence.
Host: Dr. Richard Leahy, leahy@sipi.usc.edu | Dr. Karim Jerbi, karim.jerbi.udem@gmail.com
More Information: MoBI Seminar Flyer - 05.15.2023 Quanzheng Li.pdf
Location: Hughes Aircraft Electrical Engineering Center (EEB) - EEB 349
WebCast Link: https://usc.zoom.us/j/98341725765?pwd=Zm56d2tJWEhTN2JxM1kzd1lEUUhhdz09
Audiences: Everyone Is Invited
Contact: Miki Arlen
-
Six Sigma Green Belt for Process Improvement
Tue, May 16, 2023 @ 09:00 AM - 05:00 PM
USC Viterbi School of Engineering
Conferences, Lectures, & Seminars
The Six Sigma Green Belt for Process Improvement course is a short three-day course in which you master the use of Six Sigma to identify problems in your organization and develop plans to address them. The Six Sigma Green Belt course is recommended for anyone looking for ways to support their company; no engineering background is required.
Delivery options: On Campus and Online (Interactive)
Location: TBD
Audiences: Everyone Is Invited
Contact: Melissa Medeiros
Event Link: https://engage.usc.edu/viterbi/rsvp?id=387007
-
Viterbi Startup Garage: Founding & Funding Deep Tech Companies
Wed, May 17, 2023 @ 12:00 PM - 01:00 PM
Viterbi Technology Innovation and Entrepreneurship
University Calendar
Viterbi Startup Garage: Founding & Funding Deep Tech Companies
Who: Joe Wilson, Managing Partner at Undeterred Capital
What: Companies face challenges when seeking to commercialize technological breakthroughs. How do these companies attract funding, and how are they scaled over time?
When: Wed, May 17, 2023 (12-1 PM PT)
Where: Zoom
(Register for Zoom Link)
Location: Zoom
Audiences: Everyone Is Invited
Contact: VSG
Event Link: https://vsg-events.my.canva.site/vsg-oceanside-chat-may-17-2023
-
PhD Dissertation Defense - Heramb Nemlekar
Thu, May 18, 2023 @ 11:30 AM - 01:30 PM
Thomas Lord Department of Computer Science
University Calendar
PhD Dissertation Defense - Heramb Nemlekar
Committee: Gaurav Sukhatme, Heather Culbertson, Jyotirmoy Deshmukh, Satyandra K. Gupta, Stefanos Nikolaidis (Chair)
Title: Efficiently Learning Human Preferences for Proactive Robot Assistance in Assembly Tasks
Abstract:
Robots that support humans in collaborative tasks need to adapt to the individual preferences of their human partners efficiently. While prior work has mainly focused on learning human preferences from demonstrations in the actual task, obtaining this data can be expensive in real-world settings such as assembly and manufacturing. Thus, this dissertation proposes leveraging prior knowledge of (i) similarities in the preferences of different users in a given task and (ii) similarities in the preferences of a given user in different tasks for efficient robot adaptation. First, to leverage similarities between users, we propose a two-stage approach for clustering user demonstrations to identify the dominant models of user preferences in complex assembly tasks. This allows assistive robots to efficiently infer the preferences of new users by matching their actions to a dominant preference model. We evaluate our approach in an IKEA assembly study and show that it can improve the accuracy of predicting user actions by quickly inferring the user preference. Next, to leverage similarities between tasks, we propose learning user preferences as a function of task-agnostic features (e.g., the mental and physical effort of user actions) from demonstrations in a short canonical task and transferring the preferences to the actual assembly. Obtaining demonstrations in a canonical task requires less time and human effort, allowing robots to learn user preferences efficiently. In a user study with a manually designed canonical task and an actual task of assembling a model airplane, we observe that our approach can predict user actions in the actual assembly based on the task-agnostic preferences learned in the canonical task. We extend our approach to account for users who change their preferences when switching tasks by updating the transferred user preferences during the actual task. In a human-robot assembly study, we demonstrate how an assistive robot can adapt to the changing preferences of users and proactively support them, thereby reducing their idle time and enhancing their collaborative experience. Lastly, we propose a method to automatically select a canonical task suitable for the transfer learning of human preferences based on the expressiveness of the task. Our experiments show that transferring user preferences from a short but expressive canonical task improves the accuracy of predicting user actions in longer actual tasks. Overall, this dissertation proposes and evaluates novel approaches for efficiently adapting to human preferences, which can enhance the productivity and satisfaction of human workers in real-world assemblies.
Location: Henry Salvatori Computer Science Center (SAL) - 213
Audiences: Everyone Is Invited
Contact: Melissa Ochoa
Event Link: https://usc.zoom.us/j/91591350584?pwd=a2lRcE9peGFCeFBLa05sRW1vT25UUT09