Events for April 15, 2008
-
Santa Monica College Transfer Day
Tue, Apr 15, 2008 @ 10:00 AM - 01:00 PM
Viterbi School of Engineering Undergraduate Admission
Receptions & Special Events
Students interested in transferring to USC's Viterbi School of Engineering can explore Viterbi's programs and majors, learn about the application process, and speak directly with a Viterbi transfer advisor at SMC's Transfer Fair.
Location: Santa Monica College
Audiences: Prospective Transfer Students
Contact: Christine Hsieh
-
CS Colloq: Learning Low Dimensional Representations of High Dimensional Data
Tue, Apr 15, 2008 @ 03:30 PM - 05:00 PM
Thomas Lord Department of Computer Science
Conferences, Lectures, & Seminars
Title: Learning Low Dimensional Representations of High Dimensional Data
Speaker: Dr. Fei Sha (UC Berkeley)
Abstract:
Statistical modeling of high-dimensional and complex data is a challenging task in machine learning. A powerful strategy for tackling this problem is to identify and exploit low-dimensional structures intrinsic to the data. For example, text and image data can often be represented as superpositions of meaningful and interpretable structures such as "object parts" and "topics". These structures are composed of visually salient image patches or groups of semantically related words. Examples of such learning algorithms include nonnegative matrix factorization (NMF) and latent Dirichlet allocation (LDA), where parts and topics are encoded by nonnegative basis matrices and probability distributions, respectively.

In this talk, I will focus on my research that has brought new and interesting developments to the frameworks of NMF and LDA. In the first project, I show how to extend the original NMF approach to learning meaningful "audio parts" from speech and audio data. The audio parts robustly encode harmonic structures in voices, which are key acoustic features for building machines that can analyze complicated acoustic signals as well as human listeners do. In the second project, I investigate how to incorporate supervisory information, such as class labels, into LDA models. In supervised LDA, topics are discovered by grouping words based not only on semantic similarity but also on class label proximity. These topics yield compact representations with better predictive power than those derived from the original unsupervised LDA.

Toward the end of the talk, I will briefly summarize my work on learning other types of latent structures, such as manifolds and clusters. I will then conclude by discussing all of these approaches from a general perspective and speculating on a few interesting directions for future work.

Location: Seaver Science Library (SSL) - 150
Audiences: Everyone Is Invited
Contact: CS Colloquia
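For readers unfamiliar with the factorization mentioned in the abstract above, the sketch below illustrates plain (unsupervised) NMF on a tiny, made-up term-document count matrix using scikit-learn. It is only a minimal illustration of the generic technique, not the speaker's audio-parts or supervised-LDA work; the matrix contents and component count are hypothetical.

```python
# Minimal NMF sketch (illustrative only, not the speaker's method).
# Factorizes a toy nonnegative term-document matrix X into W (document-topic
# weights) and H (topic-term basis), both nonnegative.
import numpy as np
from sklearn.decomposition import NMF

# Rows = documents, columns = vocabulary terms (made-up word counts).
X = np.array([
    [3, 2, 0, 0, 1],
    [4, 3, 1, 0, 0],
    [0, 0, 3, 4, 2],
    [0, 1, 2, 5, 3],
], dtype=float)

model = NMF(n_components=2, init="nndsvd", random_state=0, max_iter=500)
W = model.fit_transform(X)   # how strongly each document uses each "topic"
H = model.components_        # which terms make up each "topic"

print(np.round(W, 2))
print(np.round(H, 2))
```

Because both factors are constrained to be nonnegative, each row of H tends to act like an additive, interpretable "part" or "topic", which is the low-dimensional structure the abstract describes.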