Information theory meets big data: Theory, algorithms and applications to deep learning
Gao, Weihao
Description
- Title
- Information theory meets big data: Theory, algorithms and applications to deep learning
- Author(s)
- Gao, Weihao
- Issue Date
- 2019-08-19
- Director of Research (if dissertation) or Advisor (if thesis)
- Viswanath, Pramod
- Doctoral Committee Chair(s)
- Viswanath, Pramod
- Committee Member(s)
- Raginsky, Maxim
- Oh, Sewoong
- Kannan, Sreeram
- Department of Study
- Electrical and Computer Engineering
- Discipline
- Electrical and Computer Engineering
- Degree Granting Institution
- University of Illinois at Urbana-Champaign
- Degree Name
- Ph.D.
- Degree Level
- Dissertation
- Keyword(s)
- Information Theory
- Property Estimation
- Deep Learning
- Abstract
- As the era of big data arrives, people gain access to vast amounts of multi-view data. Measuring, discovering, and understanding the underlying relationships among different aspects of data is the core problem of information theory. However, traditional information theory studies this problem in an abstract, population-level way. To apply information-theoretic tools to real-world problems, it is necessary to revisit information theory at the sample level. One important bridge between traditional information theory and real-world problems is the estimation of information-theoretic quantities. Such estimators make it possible to compute traditional information-theoretic quantities from big data and to uncover hidden relationships in the data. Information-theoretic tools can also be used to improve modern machine learning techniques. This dissertation investigates several problems concerning estimators of information-theoretic quantities and their applications. It covers the following topics: (1) a theoretical study of the fundamental limits of information-theoretic quantity estimators, in particular k-nearest-neighbor estimators of differential entropy and mutual information; (2) the design of novel differential entropy and mutual information estimators for special and challenging practical scenarios, as well as new information-theoretic measures that discover complex relationships in data which traditional measures cannot capture; (3) the application of information-theoretic tools to improve training algorithms and model compression algorithms in deep learning.
- Graduation Semester
- 2019-12
- Type of Resource
- text
- Permalink
- http://hdl.handle.net/2142/106145
- Copyright and License Information
- Copyright 2019 Weihao Gao
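
The abstract's first topic concerns k-nearest-neighbor estimators of differential entropy and mutual information. As a point of reference only (this is not code from the dissertation), below is a minimal Python sketch of the classical Kozachenko-Leonenko k-NN estimator of differential entropy; the function name knn_entropy, the default k=3, and the Gaussian sanity check are illustrative assumptions.

    import numpy as np
    from scipy.special import digamma, gammaln
    from scipy.spatial import cKDTree

    def knn_entropy(x, k=3):
        """Kozachenko-Leonenko k-NN estimate of differential entropy, in nats."""
        x = np.asarray(x, dtype=float)
        n, d = x.shape
        tree = cKDTree(x)
        # Query k+1 neighbors: the nearest "neighbor" of each point is the point itself,
        # so column k holds the distance to the true k-th nearest neighbor.
        eps = tree.query(x, k=k + 1)[0][:, k]
        # Log-volume of the unit Euclidean ball in d dimensions.
        log_c_d = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
        # H_hat = psi(n) - psi(k) + log c_d + (d/n) * sum_i log eps_i
        return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        samples = rng.standard_normal((5000, 1))
        # True differential entropy of N(0,1) is 0.5*log(2*pi*e) ~ 1.419 nats.
        print(knn_entropy(samples, k=3))

The sanity check compares the estimate on standard Gaussian samples against the known closed-form entropy, which is the usual way such estimators are validated empirically.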
Owning Collections
- Graduate Dissertations and Theses at Illinois (PRIMARY)
- Dissertations and Theses - Electrical and Computer Engineering