Theory Lunch
Thu, Nov 14, 2019 @ 12:15 PM - 02:00 PM
Thomas Lord Department of Computer Science
Conferences, Lectures, & Seminars
Speaker: Mengxiao Zhang, CS PhD Student
Talk Title: Gradient Descent Provably Optimizes Over-Parameterized Neural Networks
Abstract: This talk covers the paper "Gradient Descent Provably Optimizes Over-Parameterized Neural Networks," which shows that gradient descent achieves zero training loss when training over-parameterized neural networks, even though the objective function is non-convex and non-smooth.
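As a rough numerical illustration of the setting the paper studies (not material from the talk itself), the sketch below trains a wide two-layer ReLU network with plain gradient descent on a tiny synthetic dataset; the width `m`, step size `eta`, and step count are arbitrary choices for the demo. Only the first-layer weights are trained, matching the paper's analysis, and the training loss drops toward zero despite the non-convex, non-smooth objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic dataset: n unit-norm points in d dimensions (values chosen for the demo)
n, d, m = 5, 3, 2000          # m is the hidden width; heavily over-parameterized (m >> n)
X = rng.normal(size=(n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)
y = rng.normal(size=n)

# Two-layer ReLU net f(x) = (1/sqrt(m)) * sum_r a_r * relu(w_r . x);
# the output weights a_r are fixed random signs and only W is trained.
W = rng.normal(size=(m, d))
a = rng.choice([-1.0, 1.0], size=m)

def predict(W):
    return (np.maximum(X @ W.T, 0.0) @ a) / np.sqrt(m)

def loss(W):
    return 0.5 * np.sum((predict(W) - y) ** 2)

eta = 0.5                      # step size, picked conservatively for stability
losses = [loss(W)]
for _ in range(1000):
    pre = X @ W.T              # (n, m) pre-activations
    err = predict(W) - y       # (n,) residuals
    # Gradient of the squared loss w.r.t. W (ReLU derivative = indicator pre > 0)
    grad = ((err[:, None] * (pre > 0) * a[None, :]).T @ X) / np.sqrt(m)
    W -= eta * grad
    losses.append(loss(W))

print(f"initial loss {losses[0]:.4f} -> final loss {losses[-1]:.6f}")
```

Running this shows the training loss shrinking steadily; with the width this large, the network behaves nearly linearly around its initialization, which is the mechanism behind the paper's convergence guarantee.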
Host: Shaddin Dughmi
Location: Henry Salvatori Computer Science Center (SAL) - 213
Audiences: Everyone Is Invited
Contact: Cherie Carter