BME Special Seminar
Fri, Jan 20, 2017 @ 02:30 PM - 03:30 PM
Alfred E. Mann Department of Biomedical Engineering
Conferences, Lectures, & Seminars
Speaker: Dominique Duncan, Assistant Professor of Neurology, Keck School of Medicine, LONI
Talk Title: Predicting Epileptogenesis after Traumatic Brain Injury and Using Virtual Reality to Correct Segmentation Errors in MRI
Abstract: The first part of my talk focuses on identifying biomarkers that can predict epileptogenesis after traumatic brain injury (TBI). This project, the Epilepsy Bioinformatics Study for Antiepileptogenic Therapy (EpiBioS4Rx), is a multi-site, international collaboration that includes a parallel study of humans and rats, collecting MRI and EEG data along with blood samples.
Because epileptogenesis after TBI is a multifactorial process that spans multiple modalities, identifying biomarkers to quantify the condition has proved difficult. Without a full understanding of the underlying biology, there is currently no cure for epilepsy. This study aims to address both issues, drawing on data generated and collected worldwide across different laboratories, clinical sites, data formats, and multicenter preclinical trials. Before these data can even be analyzed, a central platform is needed to standardize them and to provide tools for searching, viewing, annotating, and analysis. We are building a centralized EEG data archive that will link to the Laboratory of Neuro Imaging (LONI) Image Data Archive (IDA) for MRI data, allowing the broader epilepsy research community to access the shared data, along with analytic tools to identify and validate biomarkers of epileptogenesis in imaging and electrophysiology as well as in molecular, serological, and tissue studies.
The second part of this talk focuses on crowdsourcing the manual validation of algorithmically segmented brain volumes using virtual reality. LONI maintains the world's largest repository of neuroanatomical MRI scans. One of the lab's workflows segments these scans into labeled anatomical regions using the FreeSurfer software. Because this automation cannot yet achieve perfect accuracy, a team of trained students corrects the errors manually, a tedious and time-consuming process. We are transforming how this is done using VR technology (the HTC Vive) to work with the volumes directly in 3D space, an approach that aims to be both more intuitive and more efficient. The ultimate goal is to crowdsource the task, making the process more efficient still.
Host: Biomedical Engineering Department
Audiences: Everyone Is Invited
Contact: Mischalgrace Diasanta