Machine learning in space forms: Embeddings, classification, and similarity comparisons
Tabaghi, Puoya
Permalink
https://hdl.handle.net/2142/113832
Description
- Title
- Machine learning in space forms: Embeddings, classification, and similarity comparisons
- Author(s)
- Tabaghi, Puoya
- Issue Date
- 2021-11-30
- Director of Research (if dissertation) or Advisor (if thesis)
- Dokmanic, Ivan
- Doctoral Committee Chair(s)
- Dokmanic, Ivan
- Committee Member(s)
- Milenkovic, Olgica
- Hajek, Bruce
- Raginsky, Maxim
- Department of Study
- Electrical & Computer Eng
- Discipline
- Electrical & Computer Engr
- Degree Granting Institution
- University of Illinois at Urbana-Champaign
- Degree Name
- Ph.D.
- Degree Level
- Dissertation
- Keyword(s)
- Embedding
- Space Forms
- Non-Euclidean Classification
- Similarity Comparisons
- Abstract
- We take a non-Euclidean view of three classical machine learning subjects: low-dimensional embedding, classification, and similarity comparisons. We first introduce kinetic Euclidean distance matrices to solve kinetic distance geometry problems. In distance geometry problems (DGPs), the task is to find a geometric representation, that is, an embedding, for a collection of entities that is consistent with pairwise distance (metric) or similarity (nonmetric) measurements. In kinetic DGPs, the twist is that the points are dynamic, and our goal is to localize them by exploiting information about their trajectory class. We show that a semidefinite relaxation can reconstruct trajectories from incomplete, noisy, time-varying distance observations (a minimal sketch of a related relaxation, for the static Euclidean case, appears after the record fields below). We then introduce another distance-geometric object: hyperbolic distance matrices. Recent work has focused on hyperbolic embedding methods for low-distortion embedding of distance measurements associated with hierarchical data. We derive a semidefinite relaxation to estimate missing distance measurements and denoise them. Further, we formalize hyperbolic Procrustes analysis, which uses side information in the form of anchor points to uniquely identify the embedded points. Next, we address the design of learning algorithms in mixed-curvature spaces. Learning algorithms in low-dimensional mixed-curvature spaces have so far been limited to certain non-Euclidean neural networks. Here, we study the problem of learning a linear classifier (a perceptron) in products of Euclidean, spherical, and hyperbolic spaces, i.e., space forms. We introduce a notion of linear separation surfaces in Riemannian manifolds and use a metric that renders distances in different space forms compatible with each other and integrates them into one classifier. Lastly, we show how similarity comparisons carry information about the underlying space of geometric graphs. We introduce the ordinal spread of a distance list and relate it to the ordinal capacity of its underlying space, a notion that quantifies the space's ability to host extreme patterns in nonmetric measurements. Then, we use the distribution of random ordinal spread variables as a practical tool to identify the underlying space form.
- Graduation Semester
- 2021-12
- Type of Resource
- Thesis
- Permalink
- http://hdl.handle.net/2142/113832
- Copyright and License Information
- Copyright 2021 Puoya Tabaghi
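Illustrative sketch (not from the dissertation): the abstract refers repeatedly to semidefinite relaxations of distance geometry problems. As a point of reference only, the snippet below sketches such a relaxation for the plain static Euclidean case, namely distance matrix completion and denoising with the cvxpy modeling library. The function name `complete_edm`, the trace penalty as a rank surrogate, and the masking convention are assumptions made for this sketch; the kinetic and hyperbolic formulations studied in the thesis add structure beyond it.

```python
# Minimal sketch of Euclidean distance matrix completion via a semidefinite
# relaxation. Illustrative only; not the dissertation's formulation.
import numpy as np
import cvxpy as cp

def complete_edm(D_obs, mask, dim_penalty=1.0):
    """Estimate a full matrix of squared distances from partial, noisy observations.

    D_obs : (n, n) observed squared distances (arbitrary where unobserved)
    mask  : (n, n) boolean array, True where D_obs is observed
    """
    n = D_obs.shape[0]
    W = mask.astype(float)

    G = cp.Variable((n, n), PSD=True)            # Gram matrix of the embedded points
    d = cp.reshape(cp.diag(G), (n, 1))           # squared norms G_ii as a column
    # Squared distances induced by the Gram matrix: D_ij = G_ii + G_jj - 2 G_ij
    edm = d @ np.ones((1, n)) + np.ones((n, 1)) @ d.T - 2 * G

    data_fit = cp.sum_squares(cp.multiply(W, edm - D_obs))
    # trace(G) is a standard convex surrogate for the embedding dimension (rank of G)
    objective = cp.Minimize(data_fit + dim_penalty * cp.trace(G))
    # Centering the configuration removes the translational ambiguity
    constraints = [G @ np.ones(n) == 0]

    cp.Problem(objective, constraints).solve()
    return edm.value, G.value

# Illustrative usage on a synthetic point set:
X = np.random.randn(8, 3)
D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)   # true squared distances
mask = np.triu(np.random.rand(8, 8) < 0.6, 1)
mask = mask | mask.T | np.eye(8, dtype=bool)
D_hat, G_hat = complete_edm(D * mask, mask)
# Points can be recovered from G_hat by eigendecomposition, up to a rigid motion,
# which is where Procrustes-type alignment (mentioned in the abstract) comes in.
```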
Owning Collections
Graduate Dissertations and Theses at Illinois (Primary)