In the modern era of large-scale computing, nonconvex optimization remains a critical computational challenge. Nonconvexity of an optimization problem often implies a combinatorial structure that makes the computational problem fundamentally hard, so efficient computational tools with global approximation guarantees are in high demand. The principal investigator will study a class of nonconvex optimization problems that naturally arise from distributed intelligence systems, sparse estimation, and data analysis. The proposed research will contribute new computational tools for data analytics, statistics and machine learning, distributed and parallel computing, and multi-agent intelligence systems. The project will also develop two new courses for both undergraduate and graduate students at Princeton and will involve undergraduate students in the research through the Princeton undergraduate summer research program.
This research project aims to tackle an important class of nonconvex problems by exploiting their geometric structure through a systematic dualization approach. Specifically, the principal investigator plans to study nonconvex duality for a class of nonconvex optimization problems that admit a near-separable structure, with extensions to minimax problems and variational inequalities, and to develop computational tools that produce approximately globally optimal solutions with complexity guarantees. Beyond these fundamental aspects, the principal investigator aims to investigate practical algorithms tailored to specific problems in high-dimensional structured estimation, sparse learning, and distributed optimization. The theoretical results and new methodology are expected to advance the theory of nonconvex optimization and to provide algorithmic solutions to a large variety of distributed systems and computational challenges.
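As an illustrative sketch (not taken from the proposal itself), one common reading of "near-separable structure" is a sum of possibly nonconvex block objectives coupled only through a low-dimensional linear constraint:
\[
\min_{x_1,\dots,x_n}\ \sum_{i=1}^{n} f_i(x_i)
\quad \text{s.t.} \quad \sum_{i=1}^{n} A_i x_i = b,\qquad x_i \in X_i ,
\]
where each $f_i$ may be nonconvex. The associated Lagrangian dual
\[
\max_{\lambda}\ d(\lambda),\qquad
d(\lambda) \;=\; -\lambda^{\top} b \;+\; \sum_{i=1}^{n} \min_{x_i \in X_i} \bigl( f_i(x_i) + \lambda^{\top} A_i x_i \bigr),
\]
is always a concave maximization problem and decomposes across the $n$ blocks, so it can be attacked in a distributed fashion; classical Shapley-Folkman-type arguments bound the resulting duality gap when the number of blocks is large relative to the dimension of the coupling constraint. Whether this is the precise formulation the project targets is an assumption made here for exposition only.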