
Final ACO Doctoral Examination and Defense of Dissertation

Algebraic Methods in Optimization

Kevin Shu
ACO PhD student, School of Mathematics

Date: April 15, 2024

Time: 1:00 PM EDT

Location: Dissertation Defense Room - Price Gilbert 4222

Zoom: https://gatech.zoom.us/j/97001932651

Advisor:
Dr. Greg Blekherman, School of Mathematics, Georgia Institute of Technology

Committee:

Dr. Amir Ali Ahmadi, Dept. of Operations Research and Financial Engineering, Princeton University

Dr. Greg Blekherman, School of Mathematics, Georgia Institute of Technology

Dr. Santanu Dey, H. Milton Stewart School of Industrial and Systems Engineering, Georgia Institute of Technology

Dr. Mohit Singh, H. Milton Stewart School of Industrial and Systems Engineering, Georgia Institute of Technology

Dr. Shixuan Zhang, Industrial & Systems Engineering, Texas A&M University

Reader: Dr. Shixuan Zhang, Industrial & Systems Engineering, Texas A&M University

Thesis draft: http://kevinshu.me/thesis.pdf

Abstract: This thesis broadly concerns the use of techniques from algebra, the study of higher-order structures in mathematics, to understand difficult optimization problems. Of particular interest will be optimization problems related to systems of polynomial equations, algebraic invariants of topological spaces, and algebraic structures in convex optimization.

We will discuss several concrete examples of these kinds of problems. Firstly, we will describe new constructions for a class of polynomials known as hyperbolic polynomials, which have close connections to convex optimization. Secondly, we will describe how ideas from algebraic geometry, notably the study of Stanley-Reisner varieties, can be used to study sparse structures in semidefinite programming; this leads to quantitative bounds on certain approximations for sparse problems and to concrete connections with sparse linear regression and sparse PCA. Thirdly, we will use methods from algebraic topology to show that certain optimization problems on nonconvex topological spaces can be turned into convex problems, due to a phenomenon known as 'hidden convexity'. Specifically, we give a sufficient condition for the image of a topological space under a continuous map to be convex, together with a number of practically important examples of this phenomenon. This unifies and generalizes a number of existing results.
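
As background for two of the terms above, the following compilable LaTeX sketch records two classical facts. These are standard results included only for orientation, not contributions of the thesis.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Classical background facts only; not results from the thesis itself.

\emph{Hyperbolic polynomials.} A polynomial $p$ is hyperbolic with respect
to a direction $e$ if $p(e) \neq 0$ and $t \mapsto p(x - te)$ has only real
roots for every real $x$. The archetype: for real symmetric matrices
$A_1, \dots, A_n$ with $\sum_i e_i A_i \succ 0$,
\[
  p(x) = \det\Bigl(\sum_{i=1}^{n} x_i A_i\Bigr)
\]
is hyperbolic with respect to $e$, and its hyperbolicity cone generalizes
the positive semidefinite cone of semidefinite programming.

\emph{Hidden convexity.} A classical instance is Dines' theorem (1941): for
any two real quadratic forms $q_1, q_2$ on $\mathbb{R}^n$, the joint image
\[
  \{\, (q_1(x),\, q_2(x)) : x \in \mathbb{R}^n \,\} \subseteq \mathbb{R}^2
\]
is convex, even though the map $x \mapsto (q_1(x), q_2(x))$ is nonlinear.
\end{document}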
Finally, we will describe how to use techniques inspired by the sum-of-squares method to derive new variants of gradient descent that converge faster than standard gradient descent on smooth convex problems.
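
For context on this final topic, here is a minimal Python sketch contrasting plain gradient descent with Nesterov's classical accelerated method on a smooth convex quadratic. This is standard material illustrating what "faster than standard gradient descent" means; the sum-of-squares-derived variants developed in the thesis are not reproduced here, and all names in the sketch are illustrative.

# Minimal sketch: plain gradient descent vs. Nesterov's classical accelerated
# method on a smooth convex quadratic. Standard material for context only; the
# thesis's sum-of-squares-derived variants are not reproduced here.
import numpy as np

def f(A, b, x):
    # Smooth convex objective f(x) = 0.5 * x^T A x - b^T x, with A PSD.
    return 0.5 * x @ A @ x - b @ x

def gradient_descent(A, b, x0, L, steps):
    # Plain gradient descent with the standard step size 1/L, where L is the
    # smoothness constant (largest eigenvalue of A); error decays like O(1/k).
    x = x0.copy()
    for _ in range(steps):
        x -= (A @ x - b) / L
    return x

def nesterov(A, b, x0, L, steps):
    # Nesterov's accelerated gradient method; error decays like O(1/k^2).
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(steps):
        x_next = y - (A @ y - b) / L
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 20))
    A = M.T @ M                    # positive definite with high probability
    b = rng.standard_normal(20)
    L = np.linalg.eigvalsh(A)[-1]  # smoothness constant = largest eigenvalue
    x0 = np.zeros(20)
    fstar = f(A, b, np.linalg.solve(A, b))
    for k in (10, 100):
        gd = f(A, b, gradient_descent(A, b, x0, L, k)) - fstar
        agd = f(A, b, nesterov(A, b, x0, L, k)) - fstar
        print(f"steps={k}: gd gap={gd:.2e}, accelerated gap={agd:.2e}")

Running the script prints the optimality gap of both methods after 10 and 100 steps; the accelerated method's gap shrinks markedly faster, consistent with its O(1/k^2) rate.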