BEGIN:VCALENDAR
METHOD:PUBLISH
PRODID:-//Apple Computer\, Inc//iCal 1.0//EN
X-WR-CALNAME;VALUE=TEXT:USC
VERSION:2.0
BEGIN:VEVENT
DESCRIPTION:Speaker: Mengxiao Zhang\, CS PhD Student\nTalk Title: Gradient Descent Provably Optimizes Over-Parameterized Neural Networks\nAbstract: This talk covers the paper "Gradient Descent Provably Optimizes Over-Parameterized Neural Networks\," which shows that gradient descent achieves zero training loss even though the objective function is non-convex and non-smooth.\nHost: Shaddin Dughmi
SEQUENCE:5
DTSTART:20191114T121500
LOCATION:SAL 213
DTSTAMP:20191114T121500
SUMMARY:Theory Lunch
UID:EC9439B1-FF65-11D6-9973-003065F99D04
DTEND:20191114T140000
END:VEVENT
END:VCALENDAR