
Events Calendar




University Calendar
Events for June

  • PhD Defense - Chloe LeGendre

    Fri, Jun 07, 2019 @ 02:00 PM - 03:30 PM

    Thomas Lord Department of Computer Science



    PhD Candidate: Chloe LeGendre
    Committee: Paul Debevec (CS, chair), Jernej Barbic (CS), Michael Fink (Cinema)
    Title: Compositing Real and Virtual Objects with Realistic, Color-Accurate Illumination
    Room: GFS 104
    Time: June 7 (Friday) 2:00 PM - 3:30 PM

    Abstract:
    In digital compositing, images from a variety of sources are combined to form a seamless, realistic result that appears as though it could have been photographed in the real world. This technique is extensively used in visual effects and film production, for example when an actor filmed in a studio is composited into a different real or virtual environment. It is also widely used in augmented or mixed reality, where rendered virtual imagery is combined with a live video feed. For a realistic and convincing composite, the content from the various source images must appear to have been captured at the same time under the same lighting conditions. Thus, whether such content is photographed or synthetically rendered, digital compositing benefits from accurate measurement of scene illumination, and, for rendered objects, also from accurate material reflectance measurement. In this dissertation, we therefore present new techniques for the measurement and estimation of illumination and reflectance, geared towards the goal of producing realistic and color-accurate digital composites.
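    As a point of reference (not a contribution of the dissertation itself), the standard "over" operation that underlies digital compositing blends a foreground element into a background plate using a matte. The sketch below, in Python with NumPy, is a minimal illustration; all names are hypothetical.

        import numpy as np

        def composite_over(foreground, alpha, background):
            # Alpha "over" composite of a foreground element onto a
            # background plate.
            #   foreground, background: (H, W, 3) linear RGB images
            #   alpha: (H, W, 1) matte in [0, 1], 1 where the
            #   foreground is fully opaque
            # Each output pixel takes the foreground where alpha is 1,
            # the background where alpha is 0, and a linear blend between.
            return alpha * foreground + (1.0 - alpha) * background

    Even with a perfect matte, such a composite only appears photographed in place when the foreground's illumination matches the background's, which is the problem the techniques summarized below address.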

    First, we present a multispectral illumination measurement and playback technique to extend studio lighting reproduction for live-action compositing to the multispectral domain, improving color rendition as compared with previous work. Next, a theoretical analysis affords the selection of an optimal, minimal set of light-emitting diodes (LEDs) of distinct spectra for this technique. For post-production methods, we extend image-based relighting to the multispectral domain for the same goal of improved color rendition, forming the appearance of an object in a novel lighting environment by combining images of the object as lit by an omnidirectional, multispectral lighting basis. For the specific application of digitizing humans, which enables rendering a person under a novel viewpoint or lighting environment, we also present an efficient approach to multispectral, high-resolution facial scanning using monochrome cameras, where both geometric resolution and color rendition are improved compared to previous work.
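    Image-based relighting exploits the linearity of light transport: the appearance of a subject under an arbitrary lighting environment equals a weighted sum of images of the subject lit by each basis light in turn. The sketch below is a toy illustration of this core summation, not the dissertation's multispectral pipeline; all names are hypothetical.

        import numpy as np

        def relight(basis_images, env_weights):
            # basis_images: (n_lights, H, W, C) array with one image per
            # basis lighting direction.
            # env_weights: (n_lights, C) array giving the target
            # environment's intensity per basis direction and channel.
            # By linearity, scale each one-light-at-a-time image by the
            # environment's intensity in that direction, then sum.
            return np.einsum('lhwc,lc->hwc', basis_images, env_weights)

        # Toy usage: 4 basis lights, 8x8 RGB images, dim uniform environment.
        basis = np.random.rand(4, 8, 8, 3)
        weights = np.full((4, 3), 0.25)
        relit = relight(basis, weights)

    In this framing, extending relighting to the multispectral domain amounts to recording the basis images, and measuring the environment, in more spectral channels than the usual three, which is what improves color rendition.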

    The final technique we present is a machine-learning-based method to estimate high dynamic range (HDR), omnidirectional illumination given only a single, unconstrained mobile phone image with a limited field of view. Image formation using the aforementioned post-production techniques requires accurate lighting measurement, typically realized using panoramic HDR photography and a special color calibration target; because these offline light measurement techniques are impractical and inaccessible for the average mobile phone user, real-time mobile mixed reality compositing with convincing illumination has remained elusive. Our learning-based approach for omnidirectional HDR lighting estimation from a single image is the first to generalize to both indoor and outdoor scenes, while comparing favorably to previous methods developed to handle only a single class of lighting.
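    The general shape of such an estimator is an image-to-image network that maps a limited-field-of-view LDR photograph to a coarse omnidirectional HDR map. The sketch below is a generic encoder-decoder in PyTorch, assumed for illustration and not the dissertation's architecture; it shows the common idea of predicting radiance in the log domain so that bright light sources remain representable.

        import torch
        import torch.nn as nn

        class LightingEstimator(nn.Module):
            # Hypothetical model: maps a 3-channel LDR crop to a coarse
            # HDR radiance map.
            def __init__(self):
                super().__init__()
                self.encoder = nn.Sequential(
                    nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
                    nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
                    nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
                )
                self.decoder = nn.Sequential(
                    nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),
                    nn.ReLU(),
                    nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),
                    nn.ReLU(),
                    nn.Conv2d(32, 3, 3, padding=1),
                )

            def forward(self, x):
                # Predict log radiance, then exponentiate, so the output
                # can span the high dynamic range of real light sources.
                return torch.exp(self.decoder(self.encoder(x)))

        # Toy usage: one 128x128 LDR input yields a 3x64x64 radiance map
        # (a real system would decode to an equirectangular panorama).
        hdr = LightingEstimator()(torch.rand(1, 3, 128, 128))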

    Location: Grace Ford Salvatori Hall of Letters, Arts & Sciences (GFS) - 104

    Audiences: Everyone Is Invited

    Contact: Lizsl De Leon
