Optimization over networks: Efficient algorithms and analysis
Lee, Soomin
Description
- Title
- Optimization over networks: Efficient algorithms and analysis
- Author(s)
- Lee, Soomin
- Issue Date
- 2013-05-24
- Director of Research (if dissertation) or Advisor (if thesis)
- Nedich, Angelia
- Doctoral Committee Chair(s)
- Nedich, Angelia
- Committee Member(s)
- Roth, Dan
- Milenkovic, Olgica
- Veeravalli, Venugopal V.
- Department of Study
- Electrical & Computer Engineering
- Discipline
- Electrical & Computer Engineering
- Degree Granting Institution
- University of Illinois at Urbana-Champaign
- Degree Name
- Ph.D.
- Degree Level
- Dissertation
- Keyword(s)
- Distributed optimization
- Random projections
- Large-scale optimization
- Machine learning
- Support vector machines
- Convergence analysis
- Random gossip networks
- Consensus
- Multi-agent systems
- Abstract
- A number of important problems arising in various application domains can be formulated as a distributed convex constrained minimization problem over a multi-agent network. The problem is typically posed as the minimization of a sum of convex objective functions over an intersection of convex constraint sets. The first part of this thesis develops and analyzes efficient distributed algorithms for such a constrained convex optimization problem over a multi-agent network in which each agent has its own objective function and constraint set. We propose gradient descent algorithms with random projections that use various communication protocols. First, we present a distributed random projection (DRP) algorithm in which each agent exchanges local information only with its immediate neighbors at each iteration. Under reasonable assumptions, we prove that the iterates of all agents converge to the same point in the optimal set with probability 1. We also consider a variant of the method that uses a mini-batch of consecutive random projections and establish its convergence. Experiments on distributed support vector machines demonstrate fast convergence of the DRP algorithm: the number of iterations required for convergence is much smaller than the number needed to scan over all training samples just once. Second, we propose an asynchronous gossip-based random projection (GRP) algorithm that solves the distributed problem using gossip-type communications and local computations. We analyze the convergence properties of the algorithm for an uncoordinated diminishing stepsize and for a constant stepsize. For a diminishing stepsize, we prove that the iterates of all agents converge to the same optimal point with probability 1. For a constant stepsize, we establish an error bound on the expected distance from the iterates of the algorithm to the optimal point.
We again consider a mini-batch variant that uses several consecutive random projections and establish its convergence, and we provide simulation results on a distributed robust model predictive control problem. In the second part of the thesis, we discuss an efficient epoch gradient descent algorithm for obtaining fast and exact solutions of linear support vector machines (SVMs). SVMs penalized with the popular hinge loss are strongly convex, but the objective does not have a Lipschitz continuous gradient. Using a smooth approximation technique, we obtain SVM formulations that are both strongly convex and have Lipschitz continuous gradients.
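The DRP-style update described in the abstract can be sketched as follows. Everything concrete here is an illustrative assumption, not the thesis's setup: a 4-agent ring network, quadratic local objectives, and local constraint sets given as a few halfspaces, one of which is chosen at random for projection each iteration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not the thesis's experiments): 4 agents on a ring,
# each with local objective f_i(x) = ||x - c_i||^2 and a constraint set
# X_i given as three halfspaces {x : a.x <= 1} (all contain the origin).
n_agents, dim = 4, 2
centers = rng.normal(size=(n_agents, dim))
halfspaces = [[(a / np.linalg.norm(a), 1.0)
               for a in rng.normal(size=(3, dim))] for _ in range(n_agents)]

def project_halfspace(x, a, b):
    """Euclidean projection of x onto {y : a.y <= b} (a is unit-norm)."""
    viol = a @ x - b
    return x - viol * a if viol > 0 else x

# Doubly stochastic mixing matrix for the ring (self + two neighbors).
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

x = rng.normal(size=(n_agents, dim))
for k in range(2000):
    alpha = 1.0 / (k + 10)                     # diminishing stepsize
    v = W @ x                                  # mix with immediate neighbors
    grad = 2.0 * (v - centers)                 # local gradients at mixed point
    for i in range(n_agents):
        a, b = halfspaces[i][rng.integers(3)]  # one *random* local constraint
        x[i] = project_halfspace(v[i] - alpha * grad[i], a, b)

# Consensus check: all agents should be close to a common point.
spread = np.max(np.linalg.norm(x - x.mean(axis=0), axis=1))
```

Each agent thus performs one gradient step on its own objective at the neighbor-averaged point, followed by a projection onto a single randomly selected component of its constraint set, which is what makes the per-iteration cost cheap when the constraint set is an intersection of many simple sets.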
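The smoothing idea in the second part can be illustrated with a common smooth approximation of the hinge loss, a Huberized hinge; the thesis's particular approximation may differ. Replacing the kink at margin 1 with a quadratic band of width gamma yields a loss with a (1/gamma)-Lipschitz gradient, so the L2-regularized objective is both strongly convex and smooth.

```python
import numpy as np

def smoothed_hinge(z, gamma=0.5):
    """Huber-style smoothing of the hinge loss max(0, 1 - z).

    Quadratic on the band 1 - gamma < z < 1, linear below it, zero above;
    continuously differentiable with a (1/gamma)-Lipschitz derivative.
    """
    z = np.asarray(z, dtype=float)
    return np.where(z >= 1.0, 0.0,
           np.where(z <= 1.0 - gamma,
                    1.0 - z - gamma / 2.0,
                    (1.0 - z) ** 2 / (2.0 * gamma)))

def smoothed_hinge_grad(z, gamma=0.5):
    """Derivative w.r.t. the margin z = y * (w . x)."""
    z = np.asarray(z, dtype=float)
    return np.where(z >= 1.0, 0.0,
           np.where(z <= 1.0 - gamma, -1.0, -(1.0 - z) / gamma))

# With an L2 regularizer, the SVM objective
#   F(w) = (lam / 2) * ||w||^2 + mean_i smoothed_hinge(y_i * w @ x_i)
# is lam-strongly convex AND has a Lipschitz continuous gradient --
# the pair of properties that fast (epoch-style) gradient methods exploit.
```

Both branches of the piecewise definition agree in value and derivative at the breakpoints z = 1 and z = 1 - gamma, which is what makes the approximation differentiable everywhere.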
- Graduation Semester
- 2013-05
- Permalink
- http://hdl.handle.net/2142/44389
- Copyright and License Information
- Copyright 2013 Soomin Lee
Owning Collections
Graduate Dissertations and Theses at Illinois (PRIMARY)
Dissertations and Theses - Electrical and Computer Engineering