## All Theses

May 2020

Thesis

#### Degree Name

Master of Science (MS)

#### Department

Mathematical Sciences

#### Committee Chair/Advisor

Yuyuan Ouyang

Boshi Yang

#### Committee Member

Cristopher McMahan

#### Abstract

In this thesis our goal is to solve the dual problem of the support vector machine (SVM), which is an example of a smooth convex optimization problem over a polytope. To this end, we apply the conditional gradient (CG) method, providing an explicit solution to its linear programming (LP) subproblem. We also describe the conditional gradient sliding (CGS) method, which can be viewed as an improvement over CG in terms of the number of gradient evaluations. Although CGS achieves better complexity bounds than CG, it is not a practical method because it requires knowledge of the Lipschitz constant and of the total number of iterations in advance. As an improvement of CGS, we design a new method, conditional gradient sliding with line search (CGS-ls), that resolves these issues. CGS-ls requires $O(1/\sqrt{\epsilon})$ gradient evaluations and $O(1/\epsilon)$ linear optimization calls, matching the optimal complexity bounds of CGS. Finally, we compare the performance of our method with the CG and CGS methods numerically by applying them to the dual SVM problem for binary classification of two subsets of the MNIST handwritten digits dataset.
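To illustrate the kind of method the abstract describes, here is a minimal sketch of a generic conditional gradient (Frank-Wolfe) iteration over the unit simplex, a simple polytope where the LP subproblem has an explicit solution (pick the vertex whose gradient coordinate is smallest). This is an illustrative toy, not the thesis's dual-SVM implementation: the quadratic objective, the simplex feasible set, and the standard open-loop step size $\gamma_k = 2/(k+2)$ are all assumptions chosen for clarity.

```python
import numpy as np

def conditional_gradient(grad, x0, n_iters=200):
    """Conditional gradient (Frank-Wolfe) over the unit simplex.

    The LP subproblem min_{s in simplex} <grad f(x), s> is solved
    explicitly: the minimizer is the vertex e_i with i = argmin_i grad_i.
    """
    x = x0.copy()
    for k in range(n_iters):
        g = grad(x)
        i = np.argmin(g)               # explicit LP solution: best vertex
        s = np.zeros_like(x)
        s[i] = 1.0
        gamma = 2.0 / (k + 2)          # standard open-loop step size
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

# Toy smooth convex problem: min ||x - b||^2 over the simplex
# (b lies in the simplex, so the optimum is b itself).
b = np.array([0.1, 0.7, 0.2])
x0 = np.ones(3) / 3
x = conditional_gradient(lambda x: 2.0 * (x - b), x0)
```

Because each iterate is a convex combination of feasible points, the method is projection-free; only a linear minimization over the polytope is needed per step, which is exactly the structure CG exploits on the dual SVM polytope.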
