BEGIN:VCALENDAR
BEGIN:VEVENT
SUMMARY:CS & ML Colloquium: Matus Telgarsky (UIUC) - Representation power of neural networks
DESCRIPTION:Speaker: Matus Telgarsky, UIUC\nTalk Title: Representation power of neural networks\nSeries: Yahoo! Labs Machine Learning Seminar Series\nAbstract: This lecture satisfies requirements for CSCI 591: Computer Science Research Colloquium.\n \n This talk will present a series of mathematical vignettes on the representation power of neural networks. Amongst old results, the classical universal approximation theorem will be presented, along with Kolmogorov's superposition theorem. Recent results will include depth hierarchies (for any choice of depth, there exist functions which can only be approximated by slightly shallower networks when those networks have exponential size), connections to polynomials (namely, rational functions and neural networks well-approximate each other), and the power of recurrent networks. Open problems will be sprinkled throughout.\nBiography: Matus Telgarsky is an assistant professor at UIUC. He received his PhD in 2013 at UCSD under Sanjoy Dasgupta. He works in machine learning theory; his current interests are non-convex optimization and neural network representation.\nHost: CS Department
DTSTART:20170427T160000
LOCATION:SAL 101
URL;VALUE=URI:
DTEND:20170427T170000
END:VEVENT
END:VCALENDAR