
Events Calendar









Events for June 06, 2022

  • PhD Defense - Brandon Paulsen

    Mon, Jun 06, 2022 @ 09:00 AM - 11:00 AM

    Thomas Lord Department of Computer Science

    University Calendar


    PhD Candidate: Brandon Paulsen

    Title: Differential Verification of Deep Neural Networks

    Time: Monday, June 6, 9:00 AM PDT

    Location: https://usc.zoom.us/j/94495717078?pwd=WXhJOTN5YVVKNFB3K2ExSVZSakdkZz09

    Committee:
    Chao Wang (Advisor)
    Jyotirmoy Deshmukh
    Murali Annavaram

    Abstract:
    Neural networks have become an integral component of cyber-physical systems, such as autonomous vehicles, automated delivery robots, and factory robots, and they have great potential in many other systems as well. However, flaws in these models are frequently discovered, so in high-stakes applications, ensuring their safety, robustness, and reliability is crucial. While many prior works have been devoted to this problem domain, they are limited: they primarily target a few narrowly defined safety properties, and they support only the most common neural network architectures and activation functions.

    This dissertation addresses these limitations by (1) studying a new class of safety properties -- differential properties -- for neural networks, and (2) developing accurate algorithms for formally proving (or disproving) them that apply to general neural network architectures and activation functions. We focus on neural network equivalence as the canonical example of a differential property; however, other safety properties concerning input sensitivity and stability can be cast as differential properties as well.
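    As a minimal formalization sketch (the input region X, tolerance epsilon, and network names f_1, f_2 below are our notation, not the dissertation's), the equivalence property bounds the worst-case output difference between two networks:

        % epsilon-equivalence of two networks f_1, f_2 over an input region X
        \forall x \in X:\quad \lVert f_1(x) - f_2(x) \rVert_\infty \le \varepsilon

    Proving such a property requires reasoning about every input in X at once, which is why the dissertation develops interval and symbolic analyses rather than relying on testing.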

    This dissertation makes four key contributions toward developing accurate and general algorithms for proving differential properties. First, we formalize the equivalence problem for neural networks and develop a novel technique based on interval analysis for proving the equivalence of any two structurally similar feed-forward neural networks with ReLU activations. The key insight of this technique is to derive formulas that relate the intermediate computations of the two networks, which allows us to accurately bound the maximum difference between the two networks over all inputs. Second, we develop a novel symbolic technique that further improves the accuracy of the analysis.
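    To make the interval idea concrete, here is a minimal sketch (ours, not the dissertation's implementation; the weights and input region are made up for illustration). It propagates intervals through two ReLU networks and derives a naive bound on their output difference; the dissertation's key insight is to instead relate the networks' intermediate computations, which yields much tighter bounds than this baseline:

        import numpy as np

        def interval_affine(lo, hi, W, b):
            """Propagate an interval [lo, hi] through x -> W @ x + b."""
            W_pos, W_neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
            return W_pos @ lo + W_neg @ hi + b, W_pos @ hi + W_neg @ lo + b

        def interval_relu(lo, hi):
            return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

        def output_interval(layers, lo, hi):
            """Interval bound on a feed-forward ReLU network over [lo, hi]."""
            for i, (W, b) in enumerate(layers):
                lo, hi = interval_affine(lo, hi, W, b)
                if i < len(layers) - 1:  # ReLU on hidden layers only
                    lo, hi = interval_relu(lo, hi)
            return lo, hi

        # Two structurally similar networks: net2 is a slightly perturbed
        # (e.g., compressed) copy of net1.
        rng = np.random.default_rng(0)
        net1 = [(rng.standard_normal((4, 2)), rng.standard_normal(4)),
                (rng.standard_normal((1, 4)), rng.standard_normal(1))]
        net2 = [(W + 0.01 * rng.standard_normal(W.shape), b.copy())
                for W, b in net1]

        x_lo, x_hi = np.array([-1.0, -1.0]), np.array([1.0, 1.0])
        lo1, hi1 = output_interval(net1, x_lo, x_hi)
        lo2, hi2 = output_interval(net2, x_lo, x_hi)

        # Naive difference bound: subtract the two output intervals. Sound,
        # but loose, since it ignores how the two computations are related.
        diff_lo, diff_hi = lo1 - hi2, hi1 - lo2
        print(f"f1(x) - f2(x) in [{diff_lo[0]:.4f}, {diff_hi[0]:.4f}] on X")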
    We demonstrate the effectiveness of these two techniques by proving the equivalence of compressed neural networks with respect to the original networks. Finally, we propose two novel techniques for automatically synthesizing linear approximations of arbitrary nonlinear functions, which allows our differential techniques to apply to architectures and activation functions beyond feed-forward ReLU networks. We demonstrate that our synthesized linear approximations are significantly more accurate than the best alternative techniques.
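    For intuition on linear approximations of nonlinear activations, the sketch below uses a standard chord/tangent construction (not the dissertation's synthesis algorithm; the interval and the sampling check are illustrative) to compute sound linear lower and upper bounds for sigmoid on an interval where it is concave:

        import numpy as np

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        def linear_bounds_sigmoid_concave(l, u):
            """Linear bounds for sigmoid on [l, u], assuming 0 <= l < u.
            On a concave region, the chord is a lower bound and any
            tangent is an upper bound."""
            # Lower bound: chord from (l, sigmoid(l)) to (u, sigmoid(u)).
            a_lo = (sigmoid(u) - sigmoid(l)) / (u - l)
            b_lo = sigmoid(l) - a_lo * l
            # Upper bound: tangent at the midpoint m.
            m = 0.5 * (l + u)
            a_up = sigmoid(m) * (1.0 - sigmoid(m))  # sigmoid'(m)
            b_up = sigmoid(m) - a_up * m
            return (a_lo, b_lo), (a_up, b_up)

        l, u = 0.5, 2.0
        (a_lo, b_lo), (a_up, b_up) = linear_bounds_sigmoid_concave(l, u)

        # Sanity check: the bounds must bracket sigmoid everywhere on [l, u].
        xs = np.linspace(l, u, 1000)
        assert np.all(a_lo * xs + b_lo <= sigmoid(xs) + 1e-12)
        assert np.all(sigmoid(xs) <= a_up * xs + b_up + 1e-12)
        print(f"lower: {a_lo:.4f}x + {b_lo:.4f}, upper: {a_up:.4f}x + {b_up:.4f}")

    Tighter choices of the two lines (e.g., minimizing the area between the bounds) are exactly the kind of accuracy question the dissertation's synthesis techniques address.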


    WebCast Link: https://usc.zoom.us/j/94495717078?pwd=WXhJOTN5YVVKNFB3K2ExSVZSakdkZz09

    Audiences: Everyone Is Invited

    Contact: Lizsl De Leon
