AI Seminar - Matrix Completion, Saddlepoints, and Gradient Descent
Fri, Oct 14, 2016 @ 11:00 AM - 12:00 PM
Information Sciences Institute
Conferences, Lectures, & Seminars
Speaker: Jason Lee, USC
Talk Title: Matrix Completion, Saddlepoints, and Gradient Descent
Series: Artificial Intelligence Seminar
Abstract: Matrix completion is a fundamental machine learning problem with wide applications in collaborative filtering and recommender systems. Typically, matrix completion is solved by non-convex optimization procedures, which are empirically extremely successful. We prove that the symmetric matrix completion problem has no spurious local minima, meaning all local minima are also global. Thus the matrix completion objective has only saddlepoints and global minima.
Next, we show that saddlepoints are easy to avoid even for Gradient Descent, arguably the simplest optimization procedure. We prove that with probability 1, randomly initialized Gradient Descent converges to a local minimizer.
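As a minimal illustration of the setting the abstract describes (this is a sketch under assumed problem sizes and step size, not code from the talk), one can run randomly initialized gradient descent on the symmetric matrix completion objective f(X) = ½‖P_Ω(XXᵀ − M)‖_F², where M is the unknown low-rank matrix and P_Ω keeps only the observed entries:

```python
import numpy as np

# Illustrative sketch (not from the talk): gradient descent on the
# symmetric matrix completion objective
#   f(X) = 1/2 * || P_Omega(X X^T - M) ||_F^2,
# where M = Z Z^T is an unknown rank-r matrix and P_Omega keeps only
# the observed entries. Sizes and step size below are assumptions.
rng = np.random.default_rng(0)

n, r = 30, 2
Z = rng.standard_normal((n, r))
M = Z @ Z.T                          # ground-truth low-rank matrix
mask = rng.random((n, n)) < 0.5
mask = mask | mask.T                 # symmetric observation pattern Omega

def loss(X):
    R = (X @ X.T - M) * mask         # residual on observed entries only
    return 0.5 * np.sum(R ** 2)

# Random initialization: with probability 1 this avoids the stable
# manifolds of the saddlepoints, so GD reaches a minimizer.
X = 0.1 * rng.standard_normal((n, r))
eta = 0.002                          # step size
for _ in range(5000):
    R = (X @ X.T - M) * mask
    X -= eta * 2.0 * (R @ X)         # grad f(X) = (R + R^T) X = 2 R X

print(loss(X))                       # observed-entry loss, near zero
```

Since the objective has no spurious local minima, the iterates fit the observed entries essentially exactly despite the non-convexity, matching the picture the abstract gives.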
Biography: Jason Lee is an assistant professor in Data Sciences and Operations at the University of Southern California. Prior to that, he was a postdoctoral researcher at UC Berkeley working with Michael Jordan. Jason received his PhD at Stanford University advised by Trevor Hastie and Jonathan Taylor. His research interests are in statistics, machine learning, and optimization. Lately, he has worked on high dimensional statistical inference, analysis of non-convex optimization algorithms, and theory for deep learning.
Host: Emilio Ferrara
Location: Information Sciences Institute (ISI) - 6th Floor, CR #689; ISI, Marina del Rey
Audiences: Everyone Is Invited
Contact: Peter Zamar