Abstract
We consider the problem of minimizing a given $n$-variate polynomial $f$ over the hypercube $[-1,1]^n$. An idea introduced by Lasserre is to find a probability distribution on $[-1,1]^n$ with polynomial density function $h$ (of given degree $r$) that minimizes the expectation $\int_{[-1,1]^n} f(x)h(x)d\mu(x)$, where $d\mu(x)$ is a fixed, finite Borel measure supported on $[-1,1]^n$. It is known that, for the Lebesgue measure $d\mu(x) = dx$, one may show an error bound of $O(1/\sqrt{r})$ if $h$ is a sum-of-squares density, and an $O(1/r)$ error bound if $h$ is the density of a beta distribution. In this paper, we show an error bound of $O(1/r^2)$ if $d\mu(x) = \left( \prod_{i=1}^n \sqrt{1-x_i^2} \right)^{-1} dx$ (the well-known measure in the study of orthogonal polynomials) and $h$ has a Schmüdgen-type representation with respect to $[-1,1]^n$, which is a more general condition than being a sum of squares. The convergence rate analysis relies on the theory of polynomial kernels and, in particular, on Jackson kernels. We also show that the resulting upper bounds may be computed as generalized eigenvalue problems, as is also the case for sum-of-squares densities.
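As an illustration of the last remark, the sketch below (not taken from the paper) shows how the upper bound with a sum-of-squares density reduces to a generalized eigenvalue problem in the univariate case with the Chebyshev measure $(1-x^2)^{-1/2}dx$: with the monomial basis $1, x, \dots, x^r$, the bound is the smallest generalized eigenvalue of the pair $(A, B)$, where $A_{ij} = \int f(x)\,x^{i+j}\,d\mu(x)$ and $B_{ij} = \int x^{i+j}\,d\mu(x)$. The function names, the example polynomial, and the choice of basis are ours.

```python
# Minimal sketch: Lasserre-type upper bound for a univariate polynomial f on [-1, 1]
# with a sum-of-squares density of degree 2r and the Chebyshev measure
# dmu(x) = (1 - x^2)^(-1/2) dx, computed as a generalized eigenvalue problem.
import numpy as np
from math import comb, pi
from scipy.linalg import eigh

def chebyshev_moment(k):
    """Moment \\int_{-1}^{1} x^k (1 - x^2)^(-1/2) dx of the Chebyshev measure."""
    if k % 2 == 1:
        return 0.0
    m = k // 2
    return pi * comb(2 * m, m) / 4 ** m

def upper_bound(f_coeffs, r):
    """Smallest generalized eigenvalue of (A, B) in the monomial basis 1, x, ..., x^r,
    which equals min { \\int f h dmu : h sum of squares, deg h <= 2r, \\int h dmu = 1 }.
    f_coeffs lists the coefficients of f in the monomial basis."""
    n = r + 1
    A = np.zeros((n, n))
    B = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            B[i, j] = chebyshev_moment(i + j)
            A[i, j] = sum(c * chebyshev_moment(i + j + k)
                          for k, c in enumerate(f_coeffs))
    return eigh(A, B, eigvals_only=True)[0]

# Example: f(x) = x^2, whose minimum over [-1, 1] is 0.
# Degrees are kept small, since Hankel moment matrices in the monomial basis
# become ill-conditioned as r grows.
f = [0.0, 0.0, 1.0]
for r in (2, 4, 6, 8):
    print(r, upper_bound(f, r))
```

For this example the printed bounds decrease toward the true minimum 0 as $r$ grows; a numerically more robust implementation would use the Chebyshev polynomial basis, in which the matrix $B$ is diagonal.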
Original language | English |
---|---|
Pages (from-to) | 346-367 |
Journal | SIAM Journal on Optimization |
Volume | 27 |
Issue number | 1 |
DOIs | |
Publication status | Published - 2017 |
Keywords
- box-constrained global optimization
- polynomial optimization
- Jackson kernel
- semidefinite programming
- generalized eigenvalue problem
- sum-of-squares polynomial