University of Southern California Events Calendar





Conferences, Lectures, & Seminars
Events for January

  • Assane Gueye: Quantifying Network Vulnerability to Attacks: A Game Theoretic Approach

    Mon, Jan 14, 2013 @ 01:30 PM - 03:00 PM

    Thomas Lord Department of Computer Science

    Conferences, Lectures, & Seminars


    Speaker: Assane Gueye, National Institute of Standards and Technology (NIST)

    Talk Title: Quantifying Network Vulnerability to Attacks: A Game Theoretic Approach

    Series: CS Colloquium

    Abstract: Designing network topologies that are robust and resilient to attacks has been and continues to be an important and challenging topic in the area of communication networks. One of the main difficulties resides in quantifying the robustness of a network in the presence of an intelligent attacker who might exploit the structure of the network topology to design harmful attacks. To capture the strategic nature of the interactions between a defender and an adversary, game theoretic models have been gaining a lot of interest in the study of the security of communication networks. In a recent line of research, network blocking games have been introduced and applied to the analysis of the robustness of network topologies. A network blocking game takes as input the communication model and the topology of a network and models the strategic interactions between an adversary and the network operator as a two-player game. The Nash equilibrium strategies are then used to predict the most likely attacker's actions and the attacker's Nash equilibrium payoff serves as a quantification of the vulnerability of the network.

    In this talk, I will present the notion of network blocking games and show, through a series of examples of communication models, how they can be used to derive network vulnerability metrics. I will also show how these metrics can be used to design networks that are robust against attacks, to strengthen the robustness of existing networks, and to identify the most critical links in a network.

    This is joint work with Prof. Jean C. Walrand, Prof. Venkat Anantharam (UC Berkeley), Dr. Vladimir Marbukh (NIST), and Aron Laszka (Budapest University of Technology and Economics).


    Biography: Assane Gueye is a NIST-ARRA postdoctoral researcher in the Information Technology Laboratory (ITL) at the National Institute of Standards and Technology (NIST). He received his Ph.D. in Communication Engineering (March 2011) from the EECS department at the University of California, Berkeley, and his MSE (September 2004) in Communication Systems from the Ecole Polytechnique Fédérale de Lausanne (EPFL), Switzerland. Assane's current research is on the application of game theoretic models to communication and cyber security. His past research includes bottleneck identification in complex networks, performance evaluation of wireless cellular networks, and sensor network deployment in unknown environments. Assane is currently working in joint collaboration with NIST and the University of Maryland, College Park.

    Host: Milind Tambe

    Location: Ronald Tutor Hall of Engineering (RTH) - 306

    Audiences: Everyone Is Invited

    Contact: Assistant to CS chair

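The blocking-game construction described in the abstract can be made concrete on a toy topology. The sketch below is my own illustration, not the model from the talk: the network is an assumed 3-node ring, the operator's strategies are spanning trees, and the attacker removes a single edge, earning payoff 1 if that edge is in the chosen tree. The code checks that uniform mixed strategies form a Nash equilibrium and reads off the attacker's equilibrium payoff as the vulnerability metric.

```python
from itertools import combinations

# Toy blocking game on a 3-node ring (illustrative assumption, not the talk's model).
edges = [(0, 1), (1, 2), (0, 2)]
# On a triangle, every 2-edge subset connects all 3 nodes, so each is a spanning tree.
trees = [set(t) for t in combinations(edges, 2)]

def payoff(tree, edge):
    """Attacker's payoff: 1 if the attacked edge lies in the operator's tree."""
    return 1.0 if edge in tree else 0.0

n_trees, n_edges = len(trees), len(edges)

# Defender plays uniform over trees: the attacker's best-response payoff
# is an upper bound on the game value.
upper = max(sum(payoff(t, e) for t in trees) / n_trees for e in edges)
# Attacker plays uniform over edges: the defender's best-response cost
# is a lower bound on the game value.
lower = min(sum(payoff(t, e) for e in edges) / n_edges for t in trees)

# The bounds meet, so the uniform strategies are a Nash equilibrium.
assert abs(upper - lower) < 1e-9
vulnerability = upper  # equilibrium attacker payoff, used here as the vulnerability metric
```

On the ring every edge appears in 2 of the 3 spanning trees, so the metric comes out to 2/3; a topology whose trees overlap less on any single edge would score lower, which is the sense in which the equilibrium payoff quantifies robustness.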
  • CS Colloquium: Katrina Ligett

    Thu, Jan 17, 2013 @ 03:30 PM - 05:00 PM

    Thomas Lord Department of Computer Science

    Conferences, Lectures, & Seminars


    Speaker: Katrina Ligett, Caltech

    Talk Title: CS Colloquium: Katrina Ligett

    Series: CS Colloquium

    Abstract: In this talk, we consider the problem of estimating a potentially sensitive (individually stigmatizing) statistic on a population. In our model, individuals are concerned about their privacy and experience some cost as a function of their privacy loss. Nevertheless, they would be willing to participate in the survey if they were compensated for their privacy cost. These cost functions are not publicly known, however, nor do we make Bayesian assumptions about their form or distribution. Individuals are rational and will misreport their costs for privacy if doing so is in their best interest. Ghosh and Roth recently showed that in this setting, when costs for privacy loss may be correlated with private types, if individuals value differential privacy, then no individually rational direct revelation mechanism can compute any non-trivial estimate of the population statistic. In this paper, we circumvent this impossibility result by proposing a modified notion of how individuals experience cost as a function of their privacy loss, and by giving a mechanism which does not operate by direct revelation. Instead, our mechanism has the ability to randomly approach individuals from a population and make them a take-it-or-leave-it offer. This is intended to model the abilities of a surveyor who may stand on a street corner and approach passers-by.

    Joint work with Aaron Roth.

    Biography: Katrina Ligett has been an Assistant Professor in Computer Science and Economics at the California Institute of Technology since 2011. Prior to joining Caltech, she was a postdoctoral scholar at Cornell University, and she received her PhD in Computer Science at Carnegie Mellon University in 2009. Katrina's research interests are in algorithms, particularly online algorithms, algorithmic game theory, and data privacy. Her research has been supported by an AT&T Labs Graduate Research Fellowship, an NSF Graduate Research Fellowship, a Computing Innovation Fellows Postdoctoral Fellowship, and an NSF Mathematical Sciences Postdoctoral Fellowship.

    Host: Shaddin Dughmi

    More Information: LIGETT_BUYINGPRIVACY.pdf

    Location: Seaver Science Library (SSL) - 150

    Audiences: Everyone Is Invited

    Contact: Assistant to CS chair

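To see why estimating such a statistic is delicate when privacy costs correlate with the sensitive bit, here is a toy simulation. It is entirely my own illustration, not the paper's mechanism: a surveyor posts a single take-it-or-leave-it price, each person participates only if their privacy cost is at most the price, and the naive average over participants is biased because stigmatized individuals, who value privacy more in this toy model, opt out disproportionately.

```python
import random

random.seed(0)

def survey(pop, price):
    """Naive estimator: average the private bit over those who accept the price."""
    accepted = [b for b, cost in pop if cost <= price]
    return sum(accepted) / len(accepted)

n = 100_000
# Assumed population: half the people have the sensitive bit b=1, and those
# people draw higher privacy costs (U(0,3) vs U(0,1)) -- cost correlates with type.
pop = [(b, random.uniform(0, 1 + 2 * b))
       for b in (random.random() < 0.5 for _ in range(n))]

true_frac = sum(b for b, _ in pop) / n   # the statistic we want: about 0.5
naive_est = survey(pop, 0.5)             # biased: b=1 people rarely accept 0.5
```

With these assumed distributions the naive estimate lands near 0.25 although the true fraction is near 0.5, which is the kind of failure the talk's mechanism is designed to get around.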
  • Jinwoo Kim: A Statistical Ontology-Based Approach to Ranking for Multi-Word Search

    Tue, Jan 22, 2013 @ 12:00 PM - 02:00 PM

    Thomas Lord Department of Computer Science

    Conferences, Lectures, & Seminars


    Speaker: Jinwoo Kim, USC Computer Science PhD candidate

    Talk Title: A Statistical Ontology-Based Approach to Ranking for Multi-Word Search

    Series: PhD Defense Announcements

    Committee: Dennis McLeod (chair), Aiichiro Nakano, Larry Pryor

    Abstract:

    Keyword search is a prominent data retrieval method for the Web, largely because the simple and efficient nature of keyword processing allows a large amount of information to be searched with fast response. However, keyword search approaches do not formally capture the clear meaning of a keyword query and fail to address the semantic relationships between keywords. As a result, the accuracy (precision and recall rate) is often unsatisfactory, and the ranking algorithms fail to properly reflect the semantic relevance of keywords.

    Our research particularly focuses on increasing the accuracy of search results for multi-word search. We propose a statistical ontology-based semantic ranking algorithm based on sentence units, and a new type of query interface including wildcards. First, we assign higher ranking scores to keywords located in the same sentence than to keywords located in separate sentences. While existing statistical search algorithms such as N-grams consider only sequences of adjacent keywords, our approach can score sequences of non-adjacent as well as adjacent keywords.
    Second, we propose a slightly different type of query interface, which treats a wildcard as an independent unit of a search query, reflecting what users are actually seeking through query prediction based not on query data but on actual Web data. Unlike current information retrieval approaches such as proximity, statistical language modeling, query prediction, and query answering, our statistical ontology-based model synthesizes the proximity concept and statistical approaches into a form of ontology. This ontology helps to improve web information retrieval accuracy.

    We validated our methodology with a suite of experiments using the Text Retrieval Conference document collection. We focused on two-word queries in our experiments, as two-word queries are quite common. After applying our statistical ontology-based algorithm to the Nutch search engine, we compared the results with those of the original Nutch search and Google Desktop Search. The results demonstrate that our methodology improves accuracy significantly.

    Host: Lizsl de Leon

    Location: Henry Salvatori Computer Science Center (SAL) - 222

    Audiences: Department Only

    Contact: Assistant to CS chair

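The sentence-unit idea in the abstract above (rewarding two query words that co-occur within one sentence, even non-adjacently, over co-occurrence across sentences) can be sketched as follows. The weights and the tiny scorer are illustrative assumptions of mine, not the thesis algorithm:

```python
import re

# Assumed weights: same-sentence co-occurrence counts much more than
# co-occurrence spread across different sentences.
SAME_SENTENCE, CROSS_SENTENCE = 2.0, 0.5

def score(doc, w1, w2):
    """Score a document for a two-word query using sentence units.

    Unlike an N-gram model, this counts the two words whenever they share a
    sentence, regardless of how many words sit between them.
    """
    sentences = [s.lower() for s in re.split(r"[.!?]+", doc) if s.strip()]
    hits1 = [w1 in s.split() for s in sentences]
    hits2 = [w2 in s.split() for s in sentences]
    same = sum(a and b for a, b in zip(hits1, hits2))
    cross = (any(hits1) and any(hits2)) and not same
    return SAME_SENTENCE * same + CROSS_SENTENCE * cross

doc_a = "Keyword search retrieves web documents quickly. Ranking matters."
doc_b = "Keyword processing is simple. Search on the web needs good ranking."
```

For the query "keyword search", doc_a scores higher than doc_b even though the two words are not adjacent anywhere in either document, which is the behavior the sentence-unit approach is after.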