Epstein Institute Seminar - ISE 651
Tue, Sep 20, 2016 @ 03:30 PM - 04:50 PM
Daniel J. Epstein Department of Industrial and Systems Engineering
Conferences, Lectures, & Seminars
Speaker: Hongbo Dong, Assistant Professor - Dept. of Mathematics & Statistics - Washington State University
Talk Title: Exploiting Quadratic and Separable Structures in Nonconvex Quadratic Programs via Lift-and-Project
Abstract: In optimization, mixed-integer nonlinear programs (MINLP) are notoriously difficult to solve to global optimality. It is therefore crucial to exploit problem structures to design effective convex relaxations and/or approximation methods. The same holds even when all nonlinear terms are quadratic. We consider a generic substructure that comprises a quadratic form and separable (non-convex) constraints, and show how to derive convex relaxations for the related non-convex sets in a higher-dimensional space by using conic and semidefinite optimization techniques. By projecting such lifted relaxations back onto the original variable space, we discuss two concrete scenarios where these lift-and-project techniques improve upon current relaxations, connect with techniques from other areas, and provide new insights. The first scenario concerns generating convex quadratic cutting surfaces to iteratively strengthen classical convex relaxations for mixed-integer quadratic programs. A specialized separation routine (based on coordinate minimization) is developed to avoid (fully) solving semidefinite programs. Our proposed method achieves a more balanced trade-off between strength and computational complexity than existing relaxations, and can be easily incorporated into branch-and-bound algorithms for MINLP. The second scenario concerns the well-known problem of variable selection in statistics and machine learning. We show that lift-and-project methods are tightly connected with a (folded) concave regularization function from the statistics community, the Minimax Concave Penalty (MCP). Our lifted relaxation is very different from classical convex relaxations (LASSO, i.e., l1-norm regularization) while delivering competitive practical performance in certain scenarios.
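For orientation, a minimal sketch (not taken from the talk; the speaker's concrete formulations may differ) of the standard semidefinite lifting step and of the MCP regularizer referenced above, in LaTeX:

% Standard lift: replace the quadratic form x^T Q x by <Q, X>, with X standing
% in for xx^T, and relax the rank-one equality X = xx^T to a PSD constraint.
\[
  \min_{x,\,X} \ \langle Q, X\rangle + c^{\top} x
  \quad \text{s.t.} \quad
  \begin{pmatrix} 1 & x^{\top} \\ x & X \end{pmatrix} \succeq 0,
  \qquad (x, X) \ \text{satisfying the lifted separable constraints.}
\]
% Minimax Concave Penalty: a folded concave penalty on each coefficient t,
% with penalty level lambda > 0 and concavity parameter gamma > 1.
\[
  p_{\gamma}(t;\lambda) \;=\; \lambda \int_{0}^{|t|} \max\!\left\{0,\ 1 - \tfrac{s}{\gamma\lambda}\right\} \mathrm{d}s .
\]

Projecting the feasible set of such a lifted relaxation back onto the original x-space yields convex inequalities (cutting surfaces) that remain valid for the non-convex set.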
Biography: Hongbo Dong received his Ph.D. in Applied Mathematical and Computational Sciences from the University of Iowa in 2011. After spending two years as a postdoc in a multi-disciplinary optimization group at the University of Wisconsin-Madison, he joined the mathematics department of Washington State University as an Assistant Professor in 2013. His earlier research focused on copositive programming and convex relaxations for non-convex problems. More recently, he has been interested in developing and analyzing novel convex and non-convex formulations for problems in statistics and machine learning. His research has been published in several optimization and statistics journals, including Mathematical Programming, SIAM Journal on Optimization, and Biometrika.
Host: Dr. Jong-Shi Pang
Location: 206
Audiences: Everyone Is Invited
Contact: Angela Reneau