Date of Award

8-2022

Document Type

Thesis

Degree Name

Master of Science (MS)

Department

Electrical Engineering

Committee Chair/Advisor

Yongqiang Wang

Committee Member

Yingjie Lao

Committee Member

Yongkai Wu

Abstract

Stepsizes play a crucial role in the convergence of optimization algorithms, yet they typically require tedious manual tuning to obtain near-optimal convergence. Recently, an adaptive method for automating stepsize selection was proposed for centralized optimization. However, this method is not directly applicable to decentralized optimization: when each agent adapts its own stepsize independently, the stepsizes become heterogeneous across agents. Furthermore, naively applying consensus to the agents' stepsizes to mitigate this heterogeneity can degrade performance and even lead to divergence.

This thesis proposes an algorithm that removes the tedious manual tuning of stepsizes in decentralized optimization. The proposed algorithm automates stepsize selection and applies dynamic consensus to the agents' stepsizes, combined with a simple filter that reduces stepsize heterogeneity. We show experimentally that, without this filter, consensus between agents' stepsizes can cause divergence due to rapid changes in the local stepsizes. We support the algorithm with both theoretical guarantees and experimental results, presenting experiments on standard machine learning problems such as logistic regression, matrix factorization (whose gradients are not globally Lipschitz), and CIFAR-10 image classification.
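The abstract only sketches the algorithm at a high level. The toy Python sketch below illustrates the general pattern it describes: each agent adapts a local stepsize, the stepsizes are mixed with neighbors via a consensus (mixing) matrix, and a simple first-order low-pass filter damps rapid stepsize changes before the gradient step. The Barzilai–Borwein-style local stepsize rule, the doubly stochastic mixing matrix `W`, and the filter coefficient `beta` are all illustrative assumptions, not the thesis's exact method.

```python
import numpy as np

def decentralized_adaptive_gd(grads, x0, W, n_iters=100, beta=0.9, alpha0=0.1):
    """Toy sketch (not the thesis's algorithm): decentralized gradient
    descent in which each agent adapts its own stepsize, runs consensus
    on the stepsizes, and smooths them with a simple low-pass filter.

    grads : list of per-agent gradient functions
    W     : doubly stochastic mixing matrix (row i weights agent i's neighbors)
    beta  : filter coefficient; larger beta damps rapid stepsize changes
    """
    n = len(grads)
    x = np.tile(x0, (n, 1)).astype(float)   # one copy of the iterate per agent
    alpha = np.full(n, alpha0)              # raw local stepsizes
    alpha_f = alpha.copy()                  # filtered stepsizes actually used
    x_prev = x.copy()
    g_prev = np.array([g(x[i]) for i, g in enumerate(grads)])

    for _ in range(n_iters):
        g = np.array([grads[i](x[i]) for i in range(n)])
        # Local adaptive stepsize: a Barzilai-Borwein-style inverse
        # curvature estimate ||x - x_prev|| / ||g - g_prev|| (illustrative).
        for i in range(n):
            num = np.linalg.norm(x[i] - x_prev[i])
            den = np.linalg.norm(g[i] - g_prev[i])
            if den > 1e-12:
                alpha[i] = 0.5 * num / den
        # Consensus on the stepsizes, then a simple first-order filter
        # so the stepsize each agent uses cannot change abruptly.
        alpha = W @ alpha
        alpha_f = beta * alpha_f + (1.0 - beta) * alpha
        # Consensus on the iterates plus a filtered-stepsize gradient step.
        x_prev, g_prev = x.copy(), g
        x = W @ x - alpha_f[:, None] * g
    return x.mean(axis=0)
```

On a toy problem where agent i holds f_i(x) = (x - c_i)^2 / 2, the sum is minimized at the mean of the c_i, and the averaged iterate converges there while the filtered stepsizes settle near the inverse curvature.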
