Events for January 17, 2008
-
Incorporating Model Uncertainty in Service and Manufacturing Operations Management
Thu, Jan 17, 2008 @ 11:00 AM - 12:00 PM
Daniel J. Epstein Department of Industrial and Systems Engineering
University Calendar
DANIEL J. EPSTEIN DEPARTMENT OF INDUSTRIAL & SYSTEMS ENGINEERING SEMINAR

"Incorporating Model Uncertainty in Service and Manufacturing Operations Management"*

Dr. J. George Shanthikumar
Department of Industrial Engineering & Operations Research, University of California at Berkeley, Berkeley, CA 94720

ABSTRACT: Classical modeling approaches in Operations Management under uncertainty assume a full probabilistic characterization. The learning needed to implement the policies derived from these models is accomplished either through (i) classical statistical estimation procedures or (ii) subjective Bayesian priors. When the data available for learning is limited, or the underlying uncertainty is non-stationary, the error induced by these approaches can be significant, and the effectiveness of the derived policies will be reduced. In this presentation we discuss how to incorporate these errors in the model (that is, model the model uncertainty) and use robust optimization to derive efficient policies. Different models of model uncertainty will be discussed, and different approaches to robust optimization, with and without benchmarking, will be presented. Two alternative learning approaches, Objective Bayesian Learning and Operational Learning, will also be discussed. These approaches can be used to calibrate both the models of model uncertainty and the optimal policies. Throughout the talk, the classical inventory control, revenue management, and asset allocation problems serve as examples to illustrate these ideas.

*This presentation is based on ongoing joint research with Andrew E. B. Lim, Z. J. Max Shen, and several current and former Ph.D. students.

THURSDAY, JANUARY 17, 2008, 11:00 AM - 12:00 PM, ANDRUS GERONTOLOGY BLDG (GER) 309
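The abstract cites classical inventory control as a running example. A minimal numerical sketch of a max-min robust newsvendor under model uncertainty (the price, cost, and candidate demand models below are illustrative assumptions, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
price, cost = 5.0, 3.0  # unit revenue and unit cost (illustrative)

def expected_profit(q, demand_samples):
    """Monte Carlo estimate of E[price * min(q, D) - cost * q]."""
    return np.mean(price * np.minimum(q, demand_samples) - cost * q)

# Candidate demand models: limited data leaves the true model ambiguous.
candidates = {
    "poisson(20)":    rng.poisson(20, 100_000),
    "poisson(30)":    rng.poisson(30, 100_000),
    "uniform(10,40)": rng.integers(10, 41, 100_000),
}

qs = np.arange(0, 61)
# Robust (max-min) order quantity: maximize the worst-case expected
# profit over all candidate demand distributions.
worst = [min(expected_profit(q, d) for d in candidates.values()) for q in qs]
q_robust = qs[int(np.argmax(worst))]
print("robust order quantity:", q_robust)
```

A classical approach would fit one distribution and order its critical-fractile quantity; the max-min choice instead hedges across all models consistent with the data, which is the spirit of the robust-optimization theme in the talk.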
Location: Ethel Percy Andrus Gerontology Center (GER) - 309
Audiences: Everyone Is Invited
Contact: Georgia Lum
-
CS Colloquia: Bertrand Competition in Networks
Thu, Jan 17, 2008 @ 11:00 AM - 12:30 PM
Thomas Lord Department of Computer Science
Conferences, Lectures, & Seminars
Title: Bertrand Competition in Networks
Speaker: Prof. Shuchi Chawla (Wisconsin)

ABSTRACT:
The Internet is a unique modern artifact given its sheer size and the
number of its users. Given its continuing distributed and ad-hoc
evolution, there have been growing concerns about the effectiveness of
its current routing protocols in finding good routes and ensuring
quality of service. Imposing congestion-based and QoS-based prices on
traffic has been suggested as a way of combating the ills of this
distributed growth and selfish use of resources. Unfortunately, the
effectiveness of such approaches relies on the cooperation of the
multiple entities implementing them, namely the ISPs or Internet
service providers. The goals of the ISPs do not necessarily align with
the social objectives of efficiency and quality of service; their
primary objective is to maximize their own profit. It is therefore
imperative to study the following question: given a large
combinatorial market such as the Internet, suppose that the owners of
resources selfishly price their product to maximize their own profit,
and consumers selfishly purchase resources to maximize their own
utility, how does this affect the functioning of the market as a
whole?

We study this problem in the context of a simple network pricing game,
and analyze the performance of equilibria arising in this game as a
function of the degree of competition in the game, the network
topology, and the demand structure. Economists have traditionally
studied such questions in single-item markets. It is well known, for
example, that monopolies cause inefficiency in a market by charging
high prices, whereas competition has the effect of driving prices down
and operating efficiently. Our work extends the classical Bertrand
model of competition from economics to the network setting. For
example, we ask: Is competition in a network enough to ensure
efficient operation? Does performance worsen as the number of
monopolies grows? Does the answer depend in an interesting way on the
network topology and/or demand distribution? We provide tight bounds
on the performance (efficiency) of the network.

This is joint work with Tim Roughgarden.

BIO:
Shuchi Chawla is an assistant professor of Computer Science at
University of Wisconsin - Madison. She received her PhD in 2005 from
Carnegie Mellon University and her Bachelor of Technology degree from
the Indian Institute of Technology, Delhi, India in 2000. Her research
interests lie in theoretical computer science, with emphasis towards
approximation algorithms, combinatorial optimization, and game theory.
Shuchi is the recipient of an NSF CAREER award and an IBM Ph.D.
fellowship.

Location: Hughes Aircraft Electrical Engineering Center (EEB) - 248
Audiences: Everyone Is Invited
Contact: CS Colloquia
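The single-item intuition the abstract invokes (a monopolist prices high; Bertrand competition drives price down to cost) can be checked with a toy calculation. The linear demand curve and marginal cost below are illustrative assumptions, not from the talk:

```python
import numpy as np

# Linear demand: at price p, quantity sold is d(p) = max(0, 10 - p).
# Marginal cost per unit is c (illustrative numbers).
c = 2.0
demand = lambda p: max(0.0, 10.0 - p)
profit = lambda p: (p - c) * demand(p)

prices = np.linspace(0.0, 10.0, 10_001)

# Monopoly: a single seller picks the profit-maximizing price.
p_monopoly = prices[np.argmax([profit(p) for p in prices])]

# Bertrand duopoly: identical sellers undercut each other until the
# equilibrium price equals marginal cost and profits vanish.
p_bertrand = c

print(f"monopoly price: {p_monopoly:.2f}, Bertrand price: {p_bertrand:.2f}")
```

Here the monopolist sells 4 units at price 6, while competition sells 8 units at price 2, illustrating the efficiency loss from monopoly that the talk generalizes to network settings.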
-
Geological Storage as a Carbon Mitigation Option
Thu, Jan 17, 2008 @ 12:45 PM
Mork Family Department of Chemical Engineering and Materials Science
Conferences, Lectures, & Seminars
Lyman Handy Colloquium Series

Presenting: Michael Celia, Princeton University

Abstract: The most promising approaches to solving the carbon problem involve widespread implementation of zero-emission power plants. One promising option is to use fossil fuel-based plants with carbon capture and storage (CCS) technology. While a variety of storage options are being studied, geological storage appears to be most viable. Injection of captured CO2 into deep geological formations leads to a fairly complex flow system involving multiple fluid phases, a range of potential geochemical reactions, and mass transfer across phase interfaces.
General models of this system are computationally demanding, with the problem made more difficult by the large range of spatial scales involved, and the importance of local features for both fluid flow and geochemical reactions. An especially important local feature involves leakage pathways, with one example being abandoned wells associated with the century-long legacy of oil and gas exploration and production. Such pathways also have large uncertainties associated with their properties.
Therefore, inclusion of leakage in the storage analysis requires resolution of multiple scales, and incorporation of large uncertainties.
Taken together, these render standard numerical simulators ineffective due to their excessive computational demands. A series of simplifications to the governing equations can reduce computational demands, and ultimately render the system solvable by analytical or semi-analytical methods. These solutions, while restrictive in their assumptions, allow for large-scale analysis of leakage in a probabilistic framework. An example from Alberta, Canada, will be used to demonstrate the utility of these solutions.

Location: Olin Hall of Engineering (OHE) - 122
Audiences: Everyone Is Invited
Contact: Petra Pearce Sapir
-
Wireless Ad-Hoc Networks: From Probability to Physics via Information Theory
Thu, Jan 17, 2008 @ 01:30 PM - 02:30 PM
Ming Hsieh Department of Electrical and Computer Engineering
Conferences, Lectures, & Seminars
SPEAKER: Professor Massimo Franceschetti, University of California, San Diego

ABSTRACT: In this interdisciplinary talk we consider the problem of determining the information capacity of a network of wireless transmitters and receivers, and try to draw some non-trivial connections between spatial stochastic processes, physics, and information theory.

We present the following main result of statistical-physics flavor: by distributing uniformly at random on the order of n nodes wishing to establish pair-wise independent communications inside a domain of size on the order of n, the per-node information rate must follow an inverse square-root of n law as n tends to infinity.

The above claim, originally due (in slightly different form) to Gupta and Kumar (2000), requires both the construction of a network operation scheme that achieves the required rate and an information-theoretic proof of the optimality of such a scheme, at least in the scaling-limit sense. We present a scheme due to Franceschetti, Dousse, Tse, and Thiran (2007) which relies on the theory of percolation and achieves the inverse square-root of n law. Then, departing from the traditional information-theoretic approach of postulating fading channel and path loss models, we apply Maxwell's physics of wave propagation directly, in conjunction with Shannon's theory of information, to obtain the "natural" upper bound on the scaling limit of the per-node rate, and show that the inverse square-root of n bound is tight. This is a recent result of Franceschetti, Migliore, and Minero (2007).

The conclusion is that claims (abundant in the literature) of surpassing the inverse square-root of n law are artifacts of unrealistic channel modeling assumptions that hide the natural spatial constraints revealed by the Maxwell-Shannon approach.

Host: Prof. Urbashi Mitra, ubli@usc.edu

Location: Hughes Aircraft Electrical Engineering Center (EEB) - 248
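The scaling law at the heart of the abstract can be written compactly (notation mine, not from the announcement):

```latex
% n nodes placed uniformly at random in a domain of area on the order of n,
% with random source-destination pairs. Per-node rate R(n) and aggregate
% throughput T(n):
R(n) = \Theta\!\left(\frac{1}{\sqrt{n}}\right),
\qquad
T(n) = n \cdot R(n) = \Theta\!\left(\sqrt{n}\right).
```

The percolation scheme achieves the lower bound, and the Maxwell-Shannon argument supplies the matching upper bound, so the exponent is tight.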
Audiences: Everyone Is Invited
Contact: Mayumi Thrasher
-
Uncertainty and Bayesian inference in inverse problems
Thu, Jan 17, 2008 @ 02:00 PM - 03:00 PM
Sonny Astani Department of Civil and Environmental Engineering
Conferences, Lectures, & Seminars
Speaker: Dr. Youssef Marzouk, Massachusetts Institute of Technology

"Uncertainty and Bayesian inference in inverse problems"

Predictive simulation rests on validated models, which often must be conditioned on indirect observations. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the objects of inference. Inverse problems, representing indirect estimation of model parameters, inputs, or structural components, can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high dimensionality, as when the unknown is a spatial field.

We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also pursue dimensionality reduction for the inference of spatiotemporal fields, using truncated Karhunen-Loève representations of Gaussian process priors. These approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous transport properties.
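A minimal numpy sketch of the truncated Karhunen-Loève construction mentioned in the abstract, for a one-dimensional Gaussian process prior (the squared-exponential covariance, length scale, and truncation level are illustrative assumptions, not from the talk):

```python
import numpy as np

# Discretize a 1-D spatial field on m grid points and build a Gaussian
# process prior with squared-exponential covariance (illustrative choice).
m, length_scale = 200, 0.1
x = np.linspace(0.0, 1.0, m)
C = np.exp(-0.5 * ((x[:, None] - x[None, :]) / length_scale) ** 2)

# Karhunen-Loeve: eigendecomposition of the covariance; keep the K
# leading modes, reducing the unknown field to K scalar coefficients.
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
K = 15
basis = eigvecs[:, :K] * np.sqrt(np.maximum(eigvals[:K], 0.0))

# A prior draw of the field is basis @ xi with xi ~ N(0, I_K).
rng = np.random.default_rng(1)
field = basis @ rng.standard_normal(K)

# The leading modes capture nearly all of the prior variance.
captured = eigvals[:K].sum() / eigvals.sum()
print(f"variance captured by {K} modes: {captured:.3f}")
```

This is the dimensionality-reduction step: inference then targets the K coefficients rather than the full m-dimensional field, which is what makes the Bayesian treatment of spatial unknowns tractable.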
Location: Kaprielian Hall (KAP) - 209
Audiences: Everyone Is Invited
Contact: Evangeline Reyes
-
CS Colloquia: Network Resilience to Attack and Disaster
Thu, Jan 17, 2008 @ 03:30 PM - 05:00 PM
Thomas Lord Department of Computer Science
Conferences, Lectures, & Seminars
Title: Network Resilience to Attack and Disaster
Speaker: Prof. Dan Rubenstein (Columbia)

ABSTRACT:
Traditional network design can compensate for a small number of node
and link failures, but cannot handle attacks or failures on a massive
scale. These massive-scale phenomena may be due to malicious behavior
in the network, such as a denial-of-service attack, or to disaster,
as with an emergency sensor network deployed in a catastrophe zone
such as a fire or flood. A primary focus of our
research has been to design or enhance routing protocols so that they
are more resilient to these massive-scale challenges. The talk will
first cover the Secure Overlay Services (SOS) architecture we proposed
that utilizes network overlays to proactively protect targeted
Internet sites from distributed denial of service (DDoS) attacks.
Next, we will explore the problem of maximizing the amount of data
that can be extracted to a base-station from a sensor network whose
nodes are undergoing rapid failures. We develop a novel distributed
network coding technique and demonstrate how, in a massive failure
setting, our coding/routing technique outperforms the prior state of
the art. I will finish the talk with a brief run-through of other
projects that our lab has focused on.

BIO:
Dan Rubenstein is an Associate Professor of Electrical Engineering and
Computer Science at Columbia University. He received a B.S. degree in
mathematics from M.I.T., an M.A. in math from UCLA, and a PhD in computer
science from University of Massachusetts, Amherst. His research interests are
in network technologies, applications, and performance analysis, with a
substantial emphasis on resilient and secure networking, distributed
communication algorithms, and overlay technologies. He has received an NSF
CAREER Award, an IBM Faculty Award, the Best Student Paper award from the ACM
SIGMETRICS 2000 conference, and a Best Paper award from the IEEE ICNP 2003
Conference.

Location: Seaver Science Library (SSL) - 150
Audiences: Everyone Is Invited
Contact: CS Colloquia