
## Basic Concepts

**Semi-infinite programming (SIP)** problems are optimization problems in which there is an infinite number of variables or an infinite number of constraints (but not both). A general SIP problem can be formulated as

\[ \begin{array}{lll}

(P) & \min_{x} & f(x) \\

& \mbox{subject to} & g(x,t) \geq 0\, \forall t \in T,

\end{array} \]

where \(x = (x_1, \ldots, x_n) \in \mathbb{R}^n\), \(T\) is an infinite set, and all the functions are real-valued.

A SIP problem is called *linear* whenever the *objective function* \(f\) is linear and the constraint functions \(g(\cdot,t),\ t \in T,\) are affine; it is called *convex* whenever \(f\) is convex and \(g(\cdot,t)\) is concave for all \(t \in T\), in which case the *feasible set*

\[F = \{x \in \mathbb{R}^n\, :\, g(x,t) \geq 0\, \text{for all}\, t \in T \} \]

is convex and any local minimum is also a global minimum. We say that \(P\) is *continuous* whenever the index set \(T\) is a compact Hausdorff topological space and \(g(x,t)\) is a continuous function of \(t\) on \(T\) for each \(x \in \mathbb{R}^n\). In the particular case of linear SIP, \(g(x,t)\) can be written as

\[g (x,t) = \sum_{i=1}^{n} a_{i}(t)x_{i}-b(t)\]

with \(a_{i} : T \rightarrow \mathbb{R}, \,\,i=1,...,n,\) and \(b : T \rightarrow \mathbb{R}\); then \(P\) is continuous if and only if \(T\) is a compact Hausdorff topological space and the functions \(a_{1},...,a_{n},\, b\) are continuous on \(T\).
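In continuous linear SIP, feasibility of a point \(x\) can be tested approximately by evaluating \(g(x,t)\) on a dense grid of the compact index set. A minimal sketch (the index set \(T=[0,1]\), the coefficient functions \(a_1(t)=t\), \(a_2(t)=1\), \(b(t)=-t^2\), and the grid size are illustrative assumptions, not from the text):

```python
import numpy as np

# Hedged sketch: approximate feasibility test for a continuous linear SIP
# with T = [0, 1], by evaluating g(x, t) = sum_i a_i(t) x_i - b(t) on a
# dense grid of T. The sample data a_1(t) = t, a_2(t) = 1, b(t) = -t^2
# and the grid size are illustrative assumptions.

def sip_feasible(x, a_funcs, b_func, T_grid, tol=1e-9):
    """True iff the minimum of g(x, t) over the grid is >= -tol."""
    g_vals = sum(a(T_grid) * xi for a, xi in zip(a_funcs, x)) - b_func(T_grid)
    return bool(g_vals.min() >= -tol)

T_grid = np.linspace(0.0, 1.0, 1001)
a_funcs = [lambda t: t, lambda t: np.ones_like(t)]  # a_1(t) = t, a_2(t) = 1
b_func = lambda t: -t**2                            # b(t) = -t^2

# Here g(x, t) = t*x_1 + x_2 + t^2.
print(sip_feasible(np.array([0.0, 0.0]), a_funcs, b_func, T_grid))   # True
print(sip_feasible(np.array([0.0, -1.0]), a_funcs, b_func, T_grid))  # False
```

The grid only certifies feasibility up to its resolution; a rigorous test requires solving the global lower level problem discussed in the Algorithms section.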

In *generalized* SIP, the index set depends on the decision (or state) variable \(x\), i.e., the problem can be formulated as

\[ \begin{array}{lll}

(P) & \min_{x} & f(x) \\

& \mbox{s.t.} & g(x,t) \geq 0 \, \forall t \in T(x),

\end{array} \]

where \(T\) represents here a set-valued mapping. In generalized SIP, the feasible set \(F\) may be nonconvex and nonclosed (e.g., the union of open and closed halfspaces), and may have re-entrant corners, even in the linear case.
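To see how nonclosedness can occur even with affine data, consider the following one-dimensional example (constructed here for illustration, not taken from the text): let

\[g(x,t) = -1 \quad \text{and} \quad T(x)=\{t\in[0,1]\,:\,tx\geq 1\}.\]

Since \(g(x,t)\geq 0\) fails for every \(t\), a point \(x\) is feasible exactly when \(T(x)\) is empty, which happens if and only if \(x<1\); hence \(F=(-\infty,1)\), an open (nonclosed) set.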

The different dual problems that can be associated with \(P\) in order to obtain stopping rules for the algorithms are also semi-infinite, in the opposite sense: they have finitely many constraints and infinitely many variables (in linear SIP, one dual variable for each \(t \in T\)). It is worth observing that, in contrast to linear programming, in linear and convex SIP the finiteness of the optimal value

\[\nu(P) := \inf \{f(x) : x \in F \} \]

does not imply the existence of an optimal solution; moreover, the simultaneous finiteness of the optimal values of \(P\) and of one of its dual problems does not imply their coincidence, i.e., duality gaps may exist. For these and other reasons, SIP problems are much harder to analyze and solve than finite (in the number of unknowns and constraints) mathematical programming problems. SIP is important not only for its many potential applications, which are commonly published in non-mathematical journals, but also for the many theoretical challenges it poses, to the point that it is seen as a laboratory for advanced optimization concepts and techniques.

Indeed, only a small proportion of the vast literature on SIP (MathSciNet records more than 800 documents dealing with either "semi-infinite programming" or "semi-infinite optimization") is devoted to real applications; most documents deal with numerical methods and, more frequently, with some of the following theoretical issues (among others):

- Optimality theorems: the characterization of local and global optima.
- Duality theorems: the search for conditions guaranteeing a zero duality gap.
- Geometry of the feasible set: the study of its facial structure (in convex SIP) or its closedness (in generalized SIP).
- Stability analysis: the continuity properties (e.g., semicontinuity in the sense of Berge) of the feasible set and the optimal set, and the computation of bounds for the variation of the optimal value (Lipschitz modulus), the feasible set and the optimal set (Lipschitz-like or regularity modulus) under small perturbations of the data due to the inherent inaccuracy of the data in \(P\) or rounding off errors occurring during the computation process.
- Sensitivity (or perturbation) analysis: the estimation of the impact on the optimal value under small perturbations of the data, which involves directional differentiability concepts.
- Parametric analysis: the geometry of sets of optimal solutions (or of points satisfying the Karush-Kuhn-Tucker condition) when some data in \(P\) depend on a parameter and the perturbed problems possess a unique optimal solution (or a unique KKT point, respectively) for any possible value of the parameter; in the case of one-dimensional parameters, these sets can be seen as trajectories described by the optimal solution (or by the KKT points, respectively). Some authors consider the terms 'parametric' and 'stability' synonymous, since both refer to perturbations of a given problem \(P\); nevertheless, we prefer to distinguish them because the corresponding analyses treat the perturbations of \(P\) from topological and geometrical perspectives, respectively, making use of very different tools (such as Urysohn's Lemma or Morse theory, respectively).
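As an illustration of the possible non-attainment of the optimal value noted above, consider the following linear SIP problem (constructed here for illustration, not taken from the text):

\[ \begin{array}{lll}
& \min_{x} & x_2 \\
& \mbox{s.t.} & x_2 - e^{t}x_1 \geq (1-t)e^{t}\, \forall t \in \mathbb{R}.
\end{array} \]

The constraints are precisely the tangent inequalities of the convex function \(x_1 \mapsto e^{x_1}\), so \(F=\{x\,:\,x_2\geq e^{x_1}\}\) and \(\nu(P)=0\) (let \(x_1 \rightarrow -\infty\)), yet no feasible point attains this value.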

## Algorithms

There are two main difficulties in the numerical treatment of the SIP problem:

- Checking the feasibility of a given \(\overline{x}\in \mathbb{R}^{n}\) requires computing the optimal value, \(\nu (Q(\overline{x}))\), of the so-called *lower level problem* at \(\overline{x}\),

\[Q(\overline{x})\,:\,\min_{t}\, g(\overline{x},t)\,\,\,\text{s.t.}\,\,\, t\in T,\]

which is a global optimization problem. Even more, some algorithms require the computation of all the global minima of \(Q(\overline{x})\), and this is only possible under strong assumptions on \(g\) and \(T\), e.g., that \(g(\overline{x},t)\) is a polynomial function of \(t\) and \(T\) is a finite-dimensional interval.
- All constraints may be inactive at every feasible solution, i.e., the set of active constraints at \(\overline{x}\),

\[T_{a}(\overline{x})=\{t\in T\,:\,g(\overline{x},t)=0\},\]

could be empty for all \(\overline{x}\in F\) (even on the boundary of \(F\)). To avoid this undesirable situation, constraint qualifications are needed.
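A minimal sketch of solving the lower level problem \(Q(\overline{x})\) by a crude grid search as the global method, also reporting the approximate active set \(T_{a}(\overline{x})\) (the sample constraint \(g(x,t)=(t-x_1)^2\), the index set \(T=[0,1]\), and the tolerances are illustrative assumptions, not from the text):

```python
import numpy as np

# Hedged sketch of the lower level problem Q(xbar): globally minimize
# g(xbar, t) over T = [0, 1] by dense grid search, and report the
# approximately active indices T_a(xbar). The sample constraint
# g(x, t) = (t - x_1)^2 and all tolerances are illustrative assumptions.

def lower_level(g, xbar, T_grid, tol=1e-6):
    vals = g(xbar, T_grid)
    v = vals.min()                        # optimal value nu(Q(xbar))
    active = T_grid[np.abs(vals) < tol]   # approximate active set T_a(xbar)
    return v, active

g = lambda x, t: (t - x[0]) ** 2          # min over t is 0, attained at t = x_1
T_grid = np.linspace(0.0, 1.0, 2001)
v, active = lower_level(g, np.array([0.5]), T_grid)

# v ~ 0 means xbar = (0.5,) is feasible, and the active set contains the
# indices near t = 0.5, where the constraint holds with equality.
print(v, active)
```

Grid search is honest here precisely because \(Q(\overline{x})\) is a global problem: a local method could miss the true minimum and wrongly declare \(\overline{x}\) feasible.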

Below, we briefly describe four such approaches, whose stopping rules generally provide infeasible points. Methods generating feasible iterates for linear, nonlinear, and minmax SIP problems are proposed in León et al. (2000), Floudas and Stein (2007), and Polak (1987), respectively.
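To make the flavor of such methods concrete, here is a hedged sketch of a cutting-plane (exchange-type) iteration for linear SIP: solve a finite LP relaxation over a growing index set, then add the most violated index found by the lower level problem. The concrete instance (finding a line \(x_1 + x_2 t \geq e^t\) on \([0,1]\) that minimizes \(x_1 + x_2/2\), roughly the integral of the line) and the use of SciPy's `linprog` are illustrative assumptions, not the specific algorithms cited in the references:

```python
import numpy as np
from scipy.optimize import linprog

def cutting_plane_sip(c, a, b, T_grid, T0, tol=1e-8, max_iter=50):
    T_k = list(T0)
    for _ in range(max_iter):
        # Finite LP relaxation: a(t) . x >= b(t) for t in the current T_k,
        # rewritten as -a(t) . x <= -b(t) for linprog.
        A_ub = np.array([[-ai for ai in a(t)] for t in T_k])
        b_ub = np.array([-b(t) for t in T_k])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * len(c))
        x = res.x
        # Lower level problem on a grid: find the most violated index.
        g_vals = np.array([sum(ai * xi for ai, xi in zip(a(t), x)) - b(t)
                           for t in T_grid])
        j = int(g_vals.argmin())
        if g_vals[j] >= -tol:
            return x               # grid-feasible: stop
        T_k.append(T_grid[j])      # exchange step: add the violated cut
    return x

a = lambda t: (1.0, t)     # a(t) = (1, t): the line x_1 + x_2 t
b = np.exp                 # b(t) = e^t
c = [1.0, 0.5]             # objective x_1 + x_2/2 ~ integral of the line
x = cutting_plane_sip(c, a, b, np.linspace(0.0, 1.0, 1001), T0=[0.0, 1.0])
print(x)  # close to (1, e - 1): the chord of e^t over [0, 1]
```

Note that the returned point is only guaranteed feasible on the grid, which matches the remark above that the stopping rules of these methods generally provide (slightly) infeasible points.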

## Online and Software Resources

- Semi-infinite Optimization Solvers available on the NEOS Server
- Semi-infinite Programming survey paper by Marco Lopez and George Still
- References in Semi-infinite Optimization collected by Marco Lopez and George Still

## References

- Betrò, B. 2004. An accelerated central cutting plane algorithm for linear semi-infinite programming. *Mathematical Programming, Series A* **101**: 479-495.
- Floudas, C.A. and Stein, O. 2007. The adaptive convexification algorithm: a feasible point method for semi-infinite programming. *SIAM Journal on Optimization* **18**: 1187-1208.
- Glashoff, K. and Gustafson, S.-A. 1983. *Linear Optimization and Approximation*. Springer-Verlag, Berlin.
- Goberna, M.A. and López, M.A. 1998. *Linear Semi-Infinite Optimization*. John Wiley, Chichester.
- Gustafson, S.-A. 1979. On semi-infinite programming in numerical analysis. In *Semi-infinite Programming*. Springer-Verlag, Berlin, pp. 137-153.
- Hettich, R. 1986. An implementation of a discretization method for semi-infinite programming. *Mathematical Programming* **34**: 354-361.
- Hettich, R. and Kortanek, K.O. 1993. Semi-infinite programming: theory, methods and applications. *SIAM Review* **35**: 380-429.
- Hettich, R. and Zencke, P. 1982. *Numerische Methoden der Approximation und der semi-infiniten Optimierung* (*Numerical Methods of Approximation and Semi-infinite Optimization*). Teubner, Stuttgart.
- León, T., Sanmatías, S., and Vercher, E. 2000. On the numerical treatment of linearly constrained semi-infinite optimization problems. *European Journal of Operational Research* **121**: 78-91.
- López, M.A. and Still, G. 2007. Semi-infinite programming. *European Journal of Operational Research* **180**: 491-518.
- Optimization Online: Infinite Dimensional Optimization area
- Polak, E. 1987. On the mathematical foundation of nondifferentiable optimization in engineering design. *SIAM Review* **29**: 21-89.
- Reemtsen, R. and Görner, S. 1998. Numerical methods for semi-infinite programming: a survey. In *Semi-infinite Programming*. Kluwer, Boston, pp. 195-275.
- Stein, O. 2003. *Bilevel Strategies in Semi-infinite Programming*. Kluwer, Boston.