
Joint CSC@USC/CommNetS-MHI Seminar
Mon, Sep 10, 2018 @ 02:00 PM - 03:00 PM
Ming Hsieh Department of Electrical and Computer Engineering
Conferences, Lectures, & Seminars
Speaker: Meisam Razaviyayn, University of Southern California
Talk Title: Finding a Local Optimum of a Constrained Non-Convex Optimization Problem and its Connections to Global Optimality
Abstract: When is solving a nonconvex optimization problem easy? Despite significant research efforts to answer this question, most existing results are problem-specific and cannot be applied even after simple changes in the objective function. In this talk, we provide theoretical insights into this question by answering two related questions: 1) Are all local optima of a given optimization problem globally optimal? 2) When can we compute a local optimum of a given nonconvex constrained optimization problem efficiently? In the first part of the talk, motivated by the nonconvex training problem of deep neural networks, we provide simple sufficient conditions under which any local optimum of a given highly composite optimization problem is globally optimal. Unlike many existing results in the literature, our sufficient condition applies to many nonconvex optimization problems, such as the training problem of nonconvex multilinear neural networks and nonlinear neural networks with pyramidal structures.
In the second part of the talk, we consider the problem of finding a local optimum of a constrained nonconvex optimization problem under the strict saddle point property. We show that, unlike in the unconstrained scenario, the vanilla projected gradient descent algorithm fails to escape saddle points even in the presence of a single linear constraint. We then propose a trust region algorithm which converges to second-order stationary points for optimization problems with a small number of linear constraints. Our algorithm is the first optimization procedure, with polynomial per-iteration complexity, which converges to $\epsilon$-first-order stationary points of a non-manifold constrained optimization problem in $O(\epsilon^{-3/2})$ iterations, and at the same time can escape saddle points under the strict saddle property.
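As a minimal illustration of the strict-saddle phenomenon the abstract refers to (a toy sketch only, not the speaker's algorithm): in the unconstrained setting, plain gradient descent started slightly off a strict saddle slides away from it and reaches a local minimum. The function, step size, and starting point below are illustrative choices, not taken from the talk.

```python
import numpy as np

# Toy objective f(x, y) = x^2 - y^2 + 0.5*y^4.
# It has a strict saddle at (0, 0): the Hessian there is diag(2, -2),
# with local minima at (0, +1) and (0, -1), where f = -0.5.
def f(v):
    x, y = v
    return x**2 - y**2 + 0.5 * y**4

def grad(v):
    x, y = v
    return np.array([2.0 * x, -2.0 * y + 2.0 * y**3])

# A tiny y-offset stands in for random initialization; exactly on the
# x-axis, gradient descent would stay stuck at the saddle forever.
v = np.array([0.5, 1e-3])
for _ in range(300):
    v = v - 0.1 * grad(v)  # vanilla gradient descent

# The iterates escape the saddle and settle at a local minimum (0, +/-1).
print(v, f(v))
```

The abstract's point is that this escape behavior breaks down once even a single linear constraint and a projection step are added, which is what motivates the proposed trust region method.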
This is a joint work with Maher Nouiehed.
Biography: Meisam Razaviyayn is an assistant professor in the Department of Industrial and Systems Engineering at the University of Southern California. Prior to joining USC, he was a postdoctoral research fellow in the Department of Electrical Engineering at Stanford University, working with Professor David Tse. He received his PhD in Electrical Engineering with a minor in Computer Science at the University of Minnesota under the supervision of Professor Tom Luo. He obtained his MS degree in Mathematics under the supervision of Professor Gennady Lyubeznik. Meisam Razaviyayn is the recipient of the Signal Processing Society Young Author Best Paper Award in 2014 and was a finalist for the Best Paper Prize for Young Researchers in Continuous Optimization in 2013 and 2016.
Host: Urbashi Mitra
More Info: http://csc.usc.edu/seminars/2018Fall/razaviyayn.html
More Information: 18.09.06 Meisam Razaviyayn CSC@USC Seminar.pdf
Location: Hughes Aircraft Electrical Engineering Center (EEB) - 132
Audiences: Everyone Is Invited
Contact: Brienne Moore