Abstract

This thesis is about mathematical optimization. Mathematical optimization concerns the construction of methods for solving optimization problems, which arise when real-life problems in applied science are modeled mathematically. Examples come from electrical design, engineering, control theory, telecommunications, the environment, finance, and logistics.
This thesis deals especially with semidefinite optimization
problems. Semidefinite programming is a large class of convex optimization problems whose constraints take the form of linear matrix inequalities, where the matrix variables lie in a semidefinite cone. These inequalities induce a bound on the eigenvalues of sums of symmetric or Hermitian matrices, and eigenvalues will play an important role when we reformulate the semidefinite optimization problem.
Optimization has been used for centuries, but since the 1950s, with the advent of computer technology, it has become far more prominent. Computers have made it possible to apply algorithms to real-life applications, which usually involve a large number of variables and/or relations between those variables. The size of the problem can be in the order of thousands or even millions of variables. In certain applications the computations must be completed in a fraction of a second, so apart from powerful computers, efficient algorithms are also needed. In this thesis we concentrate on the design of an efficient solution method for semidefinite programming problems, applicable or extensible to large-scale convex programming problems. Methods that consume less computation time are needed, for example the subspace methods that already exist for solving linear equations and eigenvalue problems.
Subspace methods project the data of a specific problem onto a much smaller subproblem that can be solved very fast, and return a good approximation to the solution of the original problem. We do this by transforming the semidefinite programming problem into a convex non-smooth optimization problem, writing its constraint as an eigenvalue constraint, and we develop an algorithm that solves the semidefinite optimization problem in this formulation. The algorithm can be adjusted to apply subspace methods for eigenvalue problems. The newly designed algorithm is verified on a set of test problems.
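The reformulation mentioned above can be sketched as follows. For an affine matrix map A(x) = A0 + x1*A1 + ... + xm*Am, the semidefinite constraint A(x) ⪰ 0 holds exactly when the smallest eigenvalue of A(x) is non-negative; this eigenvalue function is concave in x but in general non-smooth. The snippet below is a minimal illustration of evaluating that constraint (the names `smallest_eigenvalue`, `A0`, `A_list` are hypothetical, not taken from the thesis):

```python
import numpy as np

def smallest_eigenvalue(x, A0, A_list):
    """Evaluate lambda_min(A(x)) for the affine map
    A(x) = A0 + sum_i x_i * A_i.  The SDP constraint A(x) >= 0
    (positive semidefinite) holds iff this value is >= 0.
    lambda_min is concave in x, so the constraint is convex,
    but non-smooth where eigenvalues coalesce."""
    A = A0 + sum(xi * Ai for xi, Ai in zip(x, A_list))
    return np.linalg.eigvalsh(A)[0]  # eigvalsh returns ascending eigenvalues

# Toy illustration: A(x) = I + x * diag(1, -1)
A0 = np.eye(2)
A1 = np.diag([1.0, -1.0])
print(smallest_eigenvalue([0.5], A0, [A1]))  # 0.5  -> constraint satisfied
print(smallest_eigenvalue([2.0], A0, [A1]))  # -1.0 -> constraint violated
```

The non-smooth convex problem then optimizes the objective subject to this scalar eigenvalue constraint instead of the matrix inequality.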
As already noted, the developed method uses a subspace procedure, which reduces the size of the matrix variables dramatically. As a consequence, for the reduced problems all computations can be done without restrictions on memory or computation time when the subspace size is set to a small number. The most expensive part of the newly designed algorithm is the computation of a small number, say s, of eigenvalues in every iteration. The value of s is considerably smaller than m, the number of constraints, and n, the size of the matrix variable in the semidefinite programming problem. By using an appropriate iterative eigensolver for the computation of these eigenvalues we expect large gains in computing time and memory requirements.
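As a hypothetical illustration of this cost saving (not the solver developed in the thesis), an iterative Lanczos-type eigensolver such as SciPy's `eigsh` can extract only s extreme eigenvalues of a large sparse symmetric matrix, avoiding a full O(n^3) dense eigendecomposition; the test matrix below is a standard tridiagonal Laplacian chosen purely for illustration:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh

n, s = 2000, 4  # matrix size n, number of wanted eigenvalues s << n

# Sparse symmetric test matrix: the 1-D discrete Laplacian (tridiagonal).
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
A = diags([off, main, off], [-1, 0, 1], format="csc")

# Shift-invert mode around sigma=0 targets the s smallest eigenvalues;
# only matrix-vector products and one sparse factorization are needed.
vals = np.sort(eigsh(A, k=s, sigma=0, return_eigenvectors=False))
```

For this matrix the smallest eigenvalues are known analytically, 2 - 2*cos(k*pi/(n+1)) for k = 1, ..., s, which gives a convenient accuracy check for the iterative solver.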