
Events Calendar


  • CS Colloquium: Jason Lee (USC, Data Sciences and Operations) - On the Foundations of Deep Learning: SGD, Overparametrization, and Generalization

    Tue, Feb 19, 2019 @ 11:00 AM - 12:00 PM

    Thomas Lord Department of Computer Science

    Conferences, Lectures, & Seminars


    Speaker: Jason Lee, USC, Data Sciences and Operations

    Talk Title: On the Foundations of Deep Learning: SGD, Overparametrization, and Generalization

    Series: CS Colloquium

    Abstract: We provide new results on the effectiveness of SGD and overparametrization in deep learning.

    a) SGD: We show that SGD converges to stationary points for general nonsmooth, nonconvex functions, and that stochastic subgradients can be efficiently computed via automatic differentiation (a toy sketch appears below). For smooth functions, we show that gradient descent, coordinate descent, ADMM, and many other algorithms avoid saddle points and converge to local minimizers. For a large family of problems including matrix completion and shallow ReLU networks, this guarantees that gradient descent converges to a global minimum.

    b) Overparametrization: We show that gradient descent finds global minimizers of the training loss of overparametrized deep networks in polynomial time.

    c) Generalization: For general neural networks, we establish a margin-based theory. The minimizer of the cross-entropy loss with weak regularization is a max-margin predictor and enjoys stronger generalization guarantees as the amount of overparametrization increases (the margin quantity is written out in a sketch below).

    d) Algorithmic and Implicit Regularization: We analyze the implicit regularization effects of various optimization algorithms on overparametrized networks. In particular, we prove that for least squares with mirror descent, the algorithm converges to the solution closest to the initialization in Bregman divergence (the Euclidean special case is sketched below). For linearly separable classification problems, we prove that steepest descent with respect to a norm solves the SVM with respect to the same norm. For overparametrized non-convex problems such as matrix sensing or neural networks with quadratic activations, we prove that gradient descent converges to the minimum nuclear norm solution, which allows for both meaningful optimization and generalization guarantees.
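
    As a concrete illustration of point (a), the sketch below (not taken from the talk; the architecture, data, and hyperparameters are illustrative assumptions) trains a small nonsmooth, nonconvex ReLU network with minibatch SGD in PyTorch, where reverse-mode automatic differentiation supplies a stochastic subgradient at each step.

        import torch

        torch.manual_seed(0)

        # Toy regression data; the squared loss of a ReLU network in its
        # parameters is nonsmooth and nonconvex.
        X = torch.randn(128, 5)
        y = torch.randn(128, 1)

        # Shallow ReLU network (hypothetical sizes).
        model = torch.nn.Sequential(
            torch.nn.Linear(5, 50),
            torch.nn.ReLU(),
            torch.nn.Linear(50, 1),
        )
        opt = torch.optim.SGD(model.parameters(), lr=0.05)

        for step in range(500):
            idx = torch.randint(0, 128, (16,))   # random minibatch -> stochastic (sub)gradient
            loss = ((model(X[idx]) - y[idx]) ** 2).mean()
            opt.zero_grad()
            loss.backward()                      # autodiff returns a valid subgradient almost everywhere
            opt.step()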
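
    For point (c), the central quantity is the normalized margin of a predictor. A tiny sketch (with a made-up linear predictor and data) of how it is computed:

        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.standard_normal((50, 3))
        w = np.array([1.0, -2.0, 0.5])      # hypothetical linear predictor f(x) = <w, x>
        y = np.sign(X @ w)                  # labels chosen so that w separates the data

        # Normalized margin: min_i y_i * f(x_i) / ||w||.  It is scale-invariant,
        # which is what makes it meaningful for overparametrized models.
        margin = np.min(y * (X @ w)) / np.linalg.norm(w)
        print(margin)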
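
    For point (d), the following sketch (again illustrative, with assumed dimensions and step size) shows the Euclidean special case of the mirror-descent statement: plain gradient descent on an underdetermined least-squares problem, started at zero, converges to the interpolating solution closest to the initialization, i.e. the minimum Euclidean norm solution.

        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.standard_normal((20, 100))    # 20 equations, 100 unknowns: many interpolating solutions
        b = rng.standard_normal(20)

        x = np.zeros(100)                     # distance is measured from this initialization
        lr = 1.0 / np.linalg.norm(A, 2) ** 2  # step size below 2/L, so the iteration is stable
        for _ in range(2000):
            x -= lr * A.T @ (A @ x - b)       # gradient of 0.5 * ||A x - b||^2

        x_min_norm = np.linalg.pinv(A) @ b    # closed-form minimum-norm interpolant
        print(np.linalg.norm(A @ x - b))        # ~ 0: training loss is driven to zero
        print(np.linalg.norm(x - x_min_norm))   # ~ 0: the iterate matches the min-norm solution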


    This lecture satisfies requirements for CSCI 591: Research Colloquium

    Biography: Jason Lee is an assistant professor in Data Sciences and Operations at the University of Southern California. Prior to that, he was a postdoctoral researcher at UC Berkeley working with Michael Jordan. Jason received his PhD at Stanford University advised by Trevor Hastie and Jonathan Taylor. His research interests are in statistics, machine learning, and optimization. Lately, he has worked on high dimensional statistical inference, analysis of non-convex optimization algorithms, and theory for deep learning.

    Host: Yan Liu

    Location: Olin Hall of Engineering (OHE) - 132

    Audiences: Everyone Is Invited

    Contact: Assistant to CS chair

