Modern methods for solving convex optimization problems
Roman Kukumberg
PhD thesis advisor: Margaréta Halická; PhD thesis consultant: Mária Trnovská


PhD thesis - Full text available for download.


Abstract: The dissertation thesis deals with modern methods for solving convex optimization problems, more precisely large-scale unconstrained optimization problems with a special structure, where the objective function splits into a convex differentiable part and a convex non-differentiable part. We focus mainly on the so-called l1-regularized problems, which have wide practical application. The most important representatives of these problems are l1-regularized least squares and l1-regularized logistic regression, for which we provide an overview of applications and solution methods. The modern methods considered include proximal methods and interior-point methods, which we examine in detail, proposing modifications and numerical improvements. We propose a hybrid method that combines the algorithm of the primal-dual interior-point method with the proximal-gradient method. We also discuss implementation issues for methods solving l1-regularized problems. We perform extensive numerical experiments and compare the methods on generated data under different conditions; for example, we test the effect of problem size, data sparsity, and the regularization parameter λ on the performance of the methods. We also apply selected methods to real problems from various areas such as optimal topology design, signal processing, medicine, and text categorization.
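To illustrate the class of problems studied, the prototypical l1-regularized least-squares problem min_x ½‖Ax − b‖² + λ‖x‖₁ can be solved with the proximal-gradient method (ISTA), one of the proximal methods discussed in the thesis. The sketch below is a minimal illustration in Python/NumPy on synthetic data, not the thesis implementation; all names and parameter choices are illustrative:

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    """Proximal-gradient (ISTA) for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant
    # of the gradient of the smooth part.
    step = 1.0 / (np.linalg.norm(A, 2) ** 2)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                 # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)  # proximal step
    return x

# Synthetic instance with a sparse ground truth (illustrative only).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
```

The l1 penalty enters the iteration only through the cheap soft-thresholding step, which is what makes proximal methods attractive for the large-scale problems treated in the thesis.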


References
[1] R. Kukumberg: Two approaches for solving l1-regularized least squares with application to truss topology design, Acta Mathematica Universitatis Comenianae, New Series, Vol. 84, No. 2 (2015), pp. 297-308.