This item is only available for download by members of the University of Illinois community. Students, faculty, and staff at the U of I may log in with their NetID and password to view the item. If you are trying to access an Illinois-restricted dissertation or thesis, you can request a copy through your library's Inter-Library Loan office or purchase a copy directly from ProQuest.
Permalink
https://hdl.handle.net/2142/20798
Description
Title
Density estimation with Kullback-Leibler loss
Author(s)
Sheu, Chyong-Hwa
Issue Date
1990
Doctoral Committee Chair(s)
Barron, Andrew
Department of Study
Statistics
Discipline
Statistics
Degree Granting Institution
University of Illinois at Urbana-Champaign
Degree Name
Ph.D.
Degree Level
Dissertation
Date of Ingest
2011-05-07T12:49:35Z
Keyword(s)
Statistics
Language
eng
Abstract
Probability density functions are estimated by the method of maximum likelihood in sequences of regular exponential families. The approximating families of log-densities that we consider are polynomials, splines, and trigonometric series. Bounds on the relative entropy (Kullback-Leibler number) between the true density and the estimator are obtained, and rates of convergence are established for log-density functions assumed to have square-integrable derivatives. The relative entropy risk between the true probability density function and the estimator is shown to converge to zero at a desired rate. The idea is to draw n samples from the true distribution and take as the estimator the maximum posterior likelihood estimate in certain regular m-parameter exponential families, with a Gaussian prior on the parameter space. The implications for universal source coding and portfolio selection are discussed.
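To illustrate the kind of estimator the abstract describes, the following is a minimal sketch, not code from the dissertation: a maximum posterior likelihood fit in an m-parameter exponential family whose log-density is a polynomial, with a Gaussian prior on the coefficients (equivalently, an L2 penalty). The choice of numpy/scipy, the degree m = 4, the support (-3, 3), and the prior variance are all assumptions made for the example.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.integrate import quad

    def basis(x, m):
        # Polynomial basis x, x^2, ..., x^m; the constant term is absorbed
        # into the normalizing constant of the exponential family.
        return np.array([x**k for k in range(1, m + 1)])

    def log_partition(theta, support):
        # log Z(theta), where Z(theta) = integral of exp(theta . basis(x))
        # over the chosen support, computed numerically.
        a, b = support
        val, _ = quad(lambda x: np.exp(theta @ basis(x, len(theta))), a, b)
        return np.log(val)

    def neg_log_posterior(theta, samples, prior_var, support):
        # Negative log-likelihood of the m-parameter exponential family
        # plus the Gaussian prior term (an L2 penalty on theta).
        n = len(samples)
        suff = np.mean([basis(x, len(theta)) for x in samples], axis=0)
        nll = -n * (theta @ suff - log_partition(theta, support))
        return nll + 0.5 * theta @ theta / prior_var

    def fit(samples, m=4, prior_var=10.0, support=(-3.0, 3.0)):
        # Maximum posterior likelihood estimate of the natural parameters.
        theta0 = np.zeros(m)
        res = minimize(neg_log_posterior, theta0,
                       args=(samples, prior_var, support), method="BFGS")
        return res.x

    # Usage: estimate a standard normal from n = 500 samples and report the
    # empirical relative entropy (Kullback-Leibler) gap on the sample.
    rng = np.random.default_rng(0)
    x = rng.normal(size=500)
    x = x[(x > -3) & (x < 3)]          # keep points inside the chosen support
    theta = fit(x, m=4)
    logZ = log_partition(theta, (-3.0, 3.0))
    log_phat = np.array([theta @ basis(xi, 4) for xi in x]) - logZ
    log_ptrue = -0.5 * x**2 - 0.5 * np.log(2 * np.pi)
    print("sample average of log p_true - log p_hat (rough KL):",
          np.mean(log_ptrue - log_phat))

The sample average of log p_true - log p_hat is only a rough stand-in for the relative entropy risk analyzed in the dissertation; the splines and trigonometric-series cases would simply swap in a different basis function.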