Breaking the dimension dependence in distributed learning
Shrivastava, Mayank
Permalink: https://hdl.handle.net/2142/124716

Description

Title: Breaking the dimension dependence in distributed learning
Author(s): Shrivastava, Mayank
Issue Date: 2024-04-30
Director of Research (if dissertation) or Advisor (if thesis): Banerjee, Arindam
Department of Study: Computer Science
Discipline: Computer Science
Degree Granting Institution: University of Illinois at Urbana-Champaign
Degree Name: M.S.
Degree Level: Thesis
Keyword(s): Distributed Learning, Deep Learning, Sketching, Overparametrized Models, Machine Learning
Abstract
The high communication cost between the server and the clients is a significant bottleneck in scaling distributed learning for modern overparameterized deep models. One popular approach to reducing this cost is linear sketching, where the sender projects the updates into a lower dimension before communication, and the receiver desketches them before any subsequent computation. While sketched distributed learning is known to scale effectively in practice, existing theoretical analyses suggest that the convergence error depends on the ambient dimension, which would impact scalability. This thesis aims to shed light on this apparent mismatch between theory and practice. Our main result is a tighter analysis that eliminates the dimension dependence in sketching without imposing unrealistically restrictive assumptions on the distributed learning setup. Leveraging the approximate restricted strong smoothness property of overparameterized deep models and the second-order geometry of the loss, we present optimization results for single-local-step and K-local-step distributed learning, along with the resulting bounds on communication complexity and their implications for analyzing and implementing distributed learning for overparameterized deep models.
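As a concrete illustration of the sketch/desketch protocol the abstract describes, the following is a minimal sketch in Python with NumPy. It assumes a Gaussian random projection shared between sender and receiver (via a common seed); the entry variance 1/k makes the desketched vector an unbiased estimate of the original update. The function names and dimensions are illustrative, not taken from the thesis.

```python
import numpy as np

def sketch(update, proj):
    # Compress: project the d-dimensional update down to k dimensions (k << d).
    return proj @ update

def desketch(sketched, proj):
    # Decompress: lift back to d dimensions. With i.i.d. Gaussian entries of
    # variance 1/k, E[proj.T @ proj] = I, so this estimate is unbiased.
    return proj.T @ sketched

rng = np.random.default_rng(0)
d, k = 1000, 100  # ambient dimension vs. sketch dimension (hypothetical sizes)

# Sender and receiver derive the same projection from a shared random seed,
# so only the k sketched coordinates need to cross the network.
proj = rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, d))

g = rng.normal(size=d)            # a client's model update (stand-in)
g_sketched = sketch(g, proj)      # k numbers sent over the wire instead of d
g_recovered = desketch(g_sketched, proj)
```

In a distributed run, each client would send its sketched update, the server would average the sketches, and a single desketch recovers an unbiased estimate of the averaged update, cutting per-round communication from O(d) to O(k).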