Thu, Apr 27, 2017 @ 04:00 PM - 05:00 PM
Conferences, Lectures, & Seminars
Speaker: Matus Telgarsky, UIUC
Talk Title: Representation power of neural networks
Series: Yahoo! Labs Machine Learning Seminar Series
This lecture satisfies requirements for CSCI 591: Computer Science Research Colloquium.
Abstract: This talk will present a series of mathematical vignettes on the representation power of neural networks. Among the older results, the classical universal approximation theorem will be presented, along with Kolmogorov's superposition theorem. Recent results will include depth hierarchies (for any choice of depth, there exist functions representable at that depth which can only be approximated by slightly shallower networks when those networks have exponential size), connections to polynomials (namely, rational functions and neural networks well-approximate each other), and the power of recurrent networks. Open problems will be sprinkled throughout.
Biography: Matus Telgarsky is an assistant professor at UIUC. He received his PhD in 2013 at UCSD under Sanjoy Dasgupta. He works in machine learning theory; his current interests are non-convex optimization and neural network representation.
Host: CS Department
Audiences: Everyone Is Invited
Contact: Assistant to CS chair