BEGIN:VCALENDAR
BEGIN:VEVENT
SUMMARY:AI Seminar-Matrix Completion, Saddlepoints, and Gradient Descent
DESCRIPTION:Speaker: Jason Lee, USC
Talk Title: Matrix Completion, Saddlepoints, and Gradient Descent
Series: Artificial Intelligence Seminar
Abstract: Matrix completion is a fundamental machine learning problem with wide applications in collaborative filtering and recommender systems. Typically, matrix completion is solved by non-convex optimization procedures, which are empirically extremely successful. We prove that the symmetric matrix completion problem has no spurious local minima, meaning all local minima are also global. Thus the matrix completion objective has only saddle points and global minima.\n
\n
Next, we show that saddle points are easy to avoid even for Gradient Descent -- arguably the simplest optimization procedure. We prove that, with probability 1, randomly initialized Gradient Descent converges to a local minimizer.\n
Biography: Jason Lee is an assistant professor in Data Sciences and Operations at the University of Southern California. Prior to that, he was a postdoctoral researcher at UC Berkeley working with Michael Jordan. Jason received his PhD at Stanford University advised by Trevor Hastie and Jonathan Taylor. His research interests are in statistics, machine learning, and optimization. Lately, he has worked on high dimensional statistical inference, analysis of non-convex optimization algorithms, and theory for deep learning.
Host: Emilio Ferrara
DTSTART:20161014T110000
LOCATION:ISI 6th Floor - CR #689; ISI-Marina del Rey
URL;VALUE=URI:
DTEND:20161014T120000
END:VEVENT
END:VCALENDAR