Since the size of the search space grows exponentially with n, the number of arguments of an objective function with an unknown and possibly large number of local minima, the cost of determining the global optimum depends critically on n.

In this lecture I observe a formal similarity between integration and optimization.

Riemann integration, like most optimization algorithms, depends on subdivision of the argument space. Numerical algorithms based on Riemann integration therefore have search spaces that are exponential in n.
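A minimal sketch of why argument-space subdivision scales badly: a grid search with m points per axis evaluates m to the power n points. The function and parameter names below are illustrative, not from the source.

```python
import itertools

def grid_search(f, bounds, m):
    """Minimize f over a Cartesian grid with m points per axis.
    The grid has m**len(bounds) points, so cost is exponential in n."""
    axes = [[lo + (hi - lo) * i / (m - 1) for i in range(m)]
            for lo, hi in bounds]
    best_x, best_val, count = None, float("inf"), 0
    for x in itertools.product(*axes):
        count += 1
        val = f(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val, count

# Sphere function: global minimum 0 at the origin.
f = lambda x: sum(xi * xi for xi in x)
n = 4
x, val, evals = grid_search(f, [(-1.0, 1.0)] * n, m=5)
print(evals)  # 5**4 = 625 evaluations; each extra dimension multiplies cost by 5
```

Doubling n here takes the evaluation count from 625 to 390625, which is the exponential blow-up the text refers to.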

Lebesgue integration depends on subdivision of the function-value space. Searching this space does not depend on n.
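To make the contrast concrete, here is a hedged sketch of a Lebesgue-style integral: the *range* of f is cut into k levels, and the measure of each level set is estimated by uniform sampling. The number of value levels k is chosen independently of n; all names and parameters are illustrative assumptions.

```python
import random

def lebesgue_integral(f, bounds, k=200, samples=20000, seed=0):
    """Approximate the integral of f over a box by subdividing the
    range of f into k value levels (a Lebesgue sum) and estimating
    the measure of each level's preimage with uniform samples.
    k is independent of the dimension n = len(bounds)."""
    rng = random.Random(seed)
    vol = 1.0
    for lo, hi in bounds:
        vol *= hi - lo
    vals = [f([rng.uniform(lo, hi) for lo, hi in bounds])
            for _ in range(samples)]
    lo_v, hi_v = min(vals), max(vals)
    step = (hi_v - lo_v) / k or 1.0  # guard against a constant f
    # Lebesgue sum: each sample contributes the lower edge of its level.
    total = sum(lo_v + step * int((v - lo_v) / step) for v in vals)
    return vol * total / samples

# Integral of x + y over the unit square is 1.
est = lebesgue_integral(lambda x: x[0] + x[1], [(0.0, 1.0), (0.0, 1.0)])
```

The sampling still happens in argument space, but the subdivision that controls the approximation of the sum is entirely in value space.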

The formal similarity between integration and optimization suggests what we call "Lebesgue optimization": a global optimization algorithm that relies primarily on subdivision of the objective-function-value space. This is arguably the best we can do to suppress the tendency of nonconvex global optimization algorithms to become impractical rapidly as n increases.
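The source does not spell out the algorithm, so the following is only an illustrative sketch of what subdivision in value space could look like: bisect on a threshold t and test, by sampling, whether the sublevel set {x : f(x) <= t} appears nonempty. The bisection depth depends on the desired accuracy in function value, not on n; every name here is hypothetical.

```python
import random

def value_bisection(f, bounds, value_iters=30, samples_per_test=2000, seed=0):
    """Illustrative 'Lebesgue optimization' sketch (not the source's
    algorithm): bisect in the space of objective values.  At each step
    we test by uniform sampling whether {x : f(x) <= t} is nonempty,
    then halve the bracket [lo, hi] on the optimal value.  The sampling
    test is heuristic: a miss does not prove the sublevel set is empty."""
    rng = random.Random(seed)
    def sample():
        return [rng.uniform(a, b) for a, b in bounds]
    best = min((sample() for _ in range(samples_per_test)), key=f)
    lo, hi = f(best) - 10.0, f(best)  # crude initial lower bound (assumption)
    for _ in range(value_iters):
        t = (lo + hi) / 2.0
        hit = None
        for _ in range(samples_per_test):
            x = sample()
            if f(x) <= t:
                hit = x
                break
        if hit is not None:
            best, hi = hit, f(hit)   # threshold achieved: shrink from above
        else:
            lo = t                   # no sample beat t: raise the lower bound
    return best, hi

# Sphere function in n = 3 dimensions; minimum value is 0 at the origin.
best, val = value_bisection(lambda x: sum(xi * xi for xi in x),
                            [(-1.0, 1.0)] * 3)
```

Note the caveat: each membership test still samples in argument space, so this sketch only moves the outer search loop into value space rather than eliminating the dependence on n altogether.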