Degree Name

Doctor of Philosophy (PhD)


Department

Mathematical Sciences

Committee Chair/Advisor

Dr. Yuyuan Ouyang

Committee Member

Dr. Brian Dean

Committee Member

Dr. Yongjia Song

Committee Member

Dr. Boshi Yang


The primary concern of this thesis is to explore efficient first-order methods for computing approximate solutions to convex optimization problems. In recent years, these methods have become increasingly desirable as many problems in fields such as machine learning and imaging science have scaled tremendously. Our aim here is to survey the capabilities of such methods and then propose new techniques that extend their reach or accelerate their performance beyond the existing state of the art.

Our novel contributions are as follows. We first show that the popular Conditional Gradient Sliding (CGS) algorithm can be extended to objectives with Hölder continuous gradients. CGS has gained much attention in recent years due to its ability to compute an approximate solution without requiring a projection oracle. However, it requires both the existence and knowledge of certain smoothness parameters to run properly. We relax the smoothness requirements and use a backtracking linesearch to run the algorithm without knowledge of the relaxed smoothness parameter. In doing so, we design a new generalized CGS algorithm that also has practical benefits over CGS in the smooth case.
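To illustrate the backtracking idea in its simplest form (this is a generic sketch of backtracking on a smoothness estimate, not the thesis's CGS variant): starting from a trial constant, one repeatedly doubles it until a standard sufficient-decrease condition holds, so the algorithm never needs the true smoothness parameter in advance. The function names and parameters below are illustrative assumptions.

```python
import numpy as np

def backtracking_step(f, grad_f, x, L0=1.0, growth=2.0, max_iters=60):
    """Take one gradient step, estimating a workable smoothness constant.

    Doubles the trial constant L until the sufficient-decrease condition
        f(x - g/L) <= f(x) - ||g||^2 / (2L)
    holds. A generic backtracking sketch, not the thesis's exact method.
    """
    g = grad_f(x)
    fx = f(x)
    L = L0
    for _ in range(max_iters):
        x_new = x - g / L
        # Accept L once the quadratic upper-bound (descent) test passes.
        if f(x_new) <= fx - np.dot(g, g) / (2.0 * L):
            return x_new, L
        L *= growth
    return x_new, L

# Example: f(x) = 0.5 * ||x||^2 has gradient Lipschitz constant exactly 1,
# so the initial trial L0 = 1 is accepted and the step lands at the minimizer.
x0 = np.array([3.0, -4.0])
x1, L = backtracking_step(lambda x: 0.5 * x @ x, lambda x: x, x0)
```

The key practical point is that the estimate adapts locally: on regions where the gradient varies slowly, the accepted constant stays small and the steps stay long.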

Chapter 4 turns our discussion to affinely constrained problems and the limitations of existing methods for solving them. Such problems can be solved by the alternating direction method of multipliers (ADMM) and are quite popular in fields such as imaging science. We discuss the current lower complexity bounds for solving such problems and suggest a potential avenue of improvement. By separating the computation of gradient and operator evaluations, we propose two new sliding methods that improve upon the best known convergence rates for such affinely constrained problems. Furthermore, we show that by carefully combining our two new methods, we can obtain a single sliding ADMM method that attains a stronger lower complexity bound for this problem class.
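For context, a minimal scaled-form ADMM on a classic affinely constrained instance, the lasso written as min ½‖Ax − b‖² + λ‖z‖₁ subject to x − z = 0, alternates a linear solve, a soft-threshold, and a multiplier update. This is the standard textbook ADMM, shown only as background; it is not the sliding method proposed in the thesis, and the parameter names are illustrative.

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, n_iter=200):
    """Standard scaled-form ADMM for min 0.5||Ax-b||^2 + lam*||z||_1, x = z.

    A textbook sketch for context only; the thesis's sliding ADMM methods
    additionally separate gradient and operator evaluations.
    """
    m, n = A.shape
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    Atb = A.T @ b
    # Cache the factor for the x-update's ridge-like linear system.
    M = np.linalg.inv(A.T @ A + rho * np.eye(n))
    for _ in range(n_iter):
        x = M @ (Atb + rho * (z - u))                 # x-update: linear solve
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # z-update: soft-threshold
        u = u + x - z                                  # multiplier (dual) update
    return z
```

Each iteration costs one linear solve and one proximal step; the lower-bound discussion in Chapter 4 concerns how many such gradient and operator evaluations are unavoidable for this problem class.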


