
Events Calendar


  • CS Student Colloquium Series: Kuan Liu & Alireza Bagheri Garakani

    Thu, Apr 23, 2015 @ 04:00 PM - 05:00 PM

    Thomas Lord Department of Computer Science

    Conferences, Lectures, & Seminars


    Speaker: Kuan Liu & Alireza Bagheri Garakani, USC Computer Science

    Talk Title: Similarity Learning for High Dimensional Sparse Data; A Distributed Frank-Wolfe Algorithm for Communication-Efficient Sparse Learning

    Series: Student Seminar Series

    Abstract: Similarity Learning for High Dimensional Sparse Data
    Kuan Liu

    A good measure of similarity between data points is crucial to many tasks in machine learning. Similarity and metric learning methods learn such measures automatically from data, but they do not scale well with respect to the dimensionality of the data. In this talk, we describe a method that can efficiently learn a similarity measure from high-dimensional sparse data. The core idea is to parameterize the similarity measure as a convex combination of rank-one matrices with specific sparsity structures. The parameters are then optimized with an approximate Frank-Wolfe procedure to maximally satisfy relative similarity constraints on the training data. Our algorithm greedily incorporates one pair of features at a time into the similarity measure, providing an efficient way to control the number of active features and thus reduce overfitting. It enjoys very appealing convergence guarantees, and its time and memory complexity depend on the sparsity of the data rather than the dimension of the feature space. Our experiments on real-world high-dimensional datasets demonstrate its potential for classification, dimensionality reduction, and data exploration.
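    The greedy atom-at-a-time behavior described above is the hallmark of the Frank-Wolfe method. As a rough illustration (a toy quadratic over the probability simplex, not the speakers' similarity-learning objective or code; all names here are assumptions), each iteration calls a linear oracle that selects a single simplex vertex, so at most one new coordinate becomes active per step:

```python
import numpy as np

def frank_wolfe(A, b, iters=100):
    """Minimize f(x) = ||Ax - b||^2 over the probability simplex (toy sketch)."""
    n = A.shape[1]
    x = np.zeros(n)
    x[0] = 1.0                           # start at a simplex vertex
    for t in range(iters):
        grad = 2.0 * A.T @ (A @ x - b)   # gradient of the quadratic
        i = int(np.argmin(grad))         # linear oracle: best vertex e_i
        gamma = 2.0 / (t + 2)            # standard Frank-Wolfe step size
        x = (1.0 - gamma) * x            # shrink the current iterate...
        x[i] += gamma                    # ...and move toward vertex e_i
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 100))
b = A @ np.eye(100)[3]                   # a target reachable at a vertex
x = frank_wolfe(A, b)
# x remains on the simplex, and at most one coordinate activates per iteration
```

    This mirrors the sparsity-control argument in the abstract: stopping after k iterations caps the number of active coordinates at roughly k, independent of the ambient dimension.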

    A Distributed Frank-Wolfe Algorithm for Communication-Efficient Sparse Learning
    Alireza Bagheri Garakani

    Learning sparse combinations is a frequent theme in machine learning. In this paper, we study its associated optimization problem in the distributed setting, where the elements to be combined are not centrally located but spread over a network. We address the key challenge of balancing communication costs and optimization errors. To this end, we propose a distributed Frank-Wolfe (dFW) algorithm. We obtain theoretical guarantees on the optimization error and communication cost that do not depend on the total number of combining elements. We further show that the communication cost of dFW is optimal by deriving a lower bound on the communication cost required to construct an epsilon-approximate solution. We validate our theoretical analysis with empirical studies on synthetic and real-world data, which demonstrate that dFW outperforms both baselines and competing methods. We also study the performance of dFW when the conditions of our analysis are relaxed, and show that dFW is fairly robust.
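    The communication pattern that makes this cheap can be sketched in a few lines (a hedged simulation of one distributed Frank-Wolfe step, not the authors' dFW implementation; the function name and setup are assumptions): atoms are partitioned across nodes, each node answers the linear oracle locally and transmits only its single best atom, so per-iteration communication is one atom per node rather than any full local dataset:

```python
import numpy as np

def dfw_step(local_atoms, grad):
    """One simulated dFW round: each node proposes its locally best atom
    (smallest inner product with the gradient); a coordinator keeps the
    global best. Only one column per node is ever 'communicated'."""
    proposals = []
    for atoms in local_atoms:            # one matrix of atoms per node
        scores = atoms.T @ grad          # local linear oracle
        j = int(np.argmin(scores))
        proposals.append(atoms[:, j])    # only this column is sent
    return min(proposals, key=lambda a: float(a @ grad))

# Tiny deterministic example: node 1 holds {e1, e2}, node 2 holds {-e1, e3}
grad = np.array([1.0, 0.0, 0.0])
local_atoms = [np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]]),
               np.array([[-1.0, 0.0], [0.0, 0.0], [0.0, 1.0]])]
best = dfw_step(local_atoms, grad)       # node 2's -e1 wins (score -1)
```

    Because each round moves one atom per node regardless of how many atoms each node stores, the total communication scales with the number of iterations and nodes, not with the total number of combining elements, which is the intuition behind the abstract's communication guarantee.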


    Host: CS Department

    Location: Henry Salvatori Computer Science Center (SAL) - 101

    Audiences: Everyone Is Invited

    Contact: Assistant to CS chair

