BEGIN:VCALENDAR
METHOD:PUBLISH
PRODID:-//Apple Computer\, Inc//iCal 1.0//EN
X-WR-CALNAME;VALUE=TEXT:USC
VERSION:2.0
BEGIN:VEVENT
DESCRIPTION:Speaker: Matus Telgarsky\, UIUC\nTalk Title: Representation
  power of neural networks\nSeries: Yahoo! Labs Machine Learning Seminar
  Series\nAbstract: This lecture satisfies requirements for CSCI 591:
  Computer Science Research Colloquium.\n\nThis talk will present a
  series of mathematical vignettes on the representation power of neural
  networks. Among older results\, the classical universal approximation
  theorem will be presented\, along with Kolmogorov's superposition
  theorem. Recent results will include depth hierarchies (for any choice
  of depth\, there exist functions computable by a small network of that
  depth which slightly shallower networks can approximate only at
  exponential size)\, connections to polynomials (namely\, rational
  functions and neural networks well-approximate each other)\, and the
  power of recurrent networks. Open problems will be sprinkled
  throughout.\nBiography: Matus Telgarsky is an assistant professor at
  UIUC. He received his PhD in 2013 at UCSD under Sanjoy Dasgupta. He
  works in machine learning theory\; his current interests are
  non-convex optimization and neural network representation.\nHost: CS
  Department
SEQUENCE:5
DTSTART:20170427T160000
LOCATION:SAL 101
DTSTAMP:20170427T160000Z
SUMMARY:CS & ML Colloquium: Matus Telgarsky (UIUC) - Representation power of neural networks
UID:EC9439B1-FF65-11D6-9973-003065F99D04
DTEND:20170427T170000
END:VEVENT
END:VCALENDAR