Description
Many important optimization problems lack the convexity that would allow them to be solved to global optimality with classical convex optimization methods. In notable cases, such as neural networks, it is not even clear what structure could be exploited to ensure global optimality. However, a wide variety of smooth nonlinear optimization problems can be reformulated as polynomial optimization problems. This polynomial structure is often ignored, and the problems are solved with local solvers such as Ipopt, which can only guarantee local optimality. The Macaulay and Sum-of-Squares (also known as Lasserre) hierarchies, on the other hand, converge to the global optimum; however, while the problem at each level of the hierarchy is solved in polynomial time, the size of that problem grows exponentially with the level. Local solvers, with their ability to yield locally optimal solutions, are therefore preferred when reaching convergence of the hierarchy is computationally too expensive. In this presentation, we show that a hierarchy leveraging both the Macaulay and the Sum-of-Squares approaches can converge at a lower level. Moreover, we show that feasible solutions can be obtained via rounding algorithms at any level of the hierarchy. This makes the methods attractive for larger problems as well, for which convergence of the hierarchy may be too expensive.
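
For readers unfamiliar with the hierarchy mentioned above, the following is the standard textbook formulation of the level-d Sum-of-Squares (Lasserre) relaxation; this is a generic sketch for orientation, not a formulation taken from the talk itself (the LaTeX assumes the amsmath package).

% Polynomial optimization problem:
%   p* = min_{x in R^n} p(x)   s.t.   g_1(x) >= 0, ..., g_m(x) >= 0.
% Level-d Sum-of-Squares (Lasserre) relaxation:
\begin{align*}
  \rho_d = \max_{\lambda,\,\sigma_0,\ldots,\sigma_m}\quad & \lambda \\
  \text{s.t.}\quad & p(x) - \lambda = \sigma_0(x) + \sum_{i=1}^{m} \sigma_i(x)\, g_i(x) \quad \text{for all } x, \\
  & \sigma_0,\ldots,\sigma_m \text{ sums of squares},\quad
    \deg \sigma_0 \le 2d,\ \deg(\sigma_i g_i) \le 2d.
\end{align*}
% Each level is a semidefinite program, solvable in polynomial time for fixed d,
% but its matrix variables are indexed by monomials of degree <= d, so their size
% grows as binom(n+d, d); under an Archimedean assumption, rho_d increases to p*
% as d -> infinity (Lasserre, 2001).

Any feasible pair (λ, σ) certifies that p(x) ≥ λ on the feasible set, so ρ_d is a lower bound on the global optimum that tightens as the level d increases.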