Mismatched divergence and universal hypothesis testing
Huang, Dayu
Permalink
https://hdl.handle.net/2142/14758
Description
- Title
- Mismatched divergence and universal hypothesis testing
- Author(s)
- Huang, Dayu
- Issue Date
- 2010-01-06T17:50:05Z
- Director of Research (if dissertation) or Advisor (if thesis)
- Meyn, Sean P.
- Doctoral Committee Chair(s)
- Meyn, Sean P.
- Department of Study
- Electrical & Computer Eng
- Discipline
- Electrical & Computer Engr
- Degree Granting Institution
- University of Illinois at Urbana-Champaign
- Degree Name
- M.S.
- Degree Level
- Thesis
- Keyword(s)
- Kullback-Leibler divergence
- Hoeffding test
- Pinsker's inequality
- universal hypothesis testing
- robust test
- mismatched universal test
- mismatched divergence
- detection
- learning
- variance
- variational representation
- Abstract
- An important challenge in detection theory is that the size of the state space may be very large. In the context of universal hypothesis testing, two important problems pertaining to a large state space had not previously been addressed: (1) What is the impact of a large state space on the performance of tests? (2) How does one design an effective test when the state space is large? This thesis addresses these two problems by developing a generalization of the Kullback-Leibler (KL) divergence, called the mismatched divergence.
  1. We describe a drawback of the Hoeffding test: its asymptotic bias and variance are approximately proportional to the size of the state space, so it performs poorly when the number of test samples is comparable to the size of the state space.
  2. We develop a generalization of the Hoeffding test based on the mismatched divergence, called the mismatched universal test. We show that this test has asymptotic bias and variance proportional to the dimension of the function class used to define the mismatched divergence. Since this dimension can be chosen to be much smaller than the size of the state space, the proposed test has better finite-sample performance in terms of bias and variance.
  3. We demonstrate that the mismatched universal test also has an advantage when the distribution of the null hypothesis is learned from data.
  4. We develop some algebraic properties and geometric interpretations of the mismatched divergence, and show its connection to a robust test.
  5. We develop a generalization of Pinsker's inequality, which gives a lower bound on the mismatched divergence.
- Graduation Semester
- 2009-12
- Copyright and License Information
- Copyright 2009 Dayu Huang
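The Hoeffding test statistic (the KL divergence between the empirical distribution and the null) and the mismatched divergence via its variational representation over a function class can be sketched numerically as below. This is a minimal illustration, not the thesis's implementation: the function names, the linear function class, and the gradient-ascent solver are all illustrative choices.

```python
import numpy as np

def kl_divergence(mu, pi):
    """KL divergence D(mu || pi) between finite distributions.
    Assumes pi > 0 wherever mu > 0."""
    mask = mu > 0
    return float(np.sum(mu[mask] * np.log(mu[mask] / pi[mask])))

def hoeffding_statistic(samples, pi, n_states):
    """Hoeffding test statistic: D(empirical distribution || pi)."""
    counts = np.bincount(samples, minlength=n_states)
    mu_hat = counts / counts.sum()
    return kl_divergence(mu_hat, pi)

def mismatched_divergence(mu, pi, basis, steps=2000, lr=0.5):
    """Mismatched divergence over the linear function class spanned by
    the rows of `basis` (a d x n array):
        D_MM(mu || pi) = sup_r  mu(f_r) - log pi(exp(f_r)),  f_r = r . basis
    The objective is concave in r, so plain gradient ascent suffices
    for this small illustration."""
    r = np.zeros(basis.shape[0])
    for _ in range(steps):
        f = r @ basis
        w = pi * np.exp(f - f.max())
        w /= w.sum()                 # pi "twisted" by exp(f_r)
        r += lr * (basis @ (mu - w)) # gradient of the objective in r
    f = r @ basis
    return float(mu @ f - np.log(pi @ np.exp(f)))
```

When the function class is rich enough to contain all functions on the state space (e.g. `basis = np.eye(n)`), the supremum is attained at `f = log(mu/pi)` up to an additive constant and the mismatched divergence coincides with the KL divergence; restricting to a d-dimensional class with d much smaller than the state space is what reduces the bias and variance of the resulting test.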
Owning Collections
Dissertations and Theses - Electrical and Computer Engineering
Graduate Dissertations and Theses at Illinois (PRIMARY)