Optimal entropy estimation on large alphabets via best polynomial approximation
- Resource Type
- Conference
- Authors
- Wu, Yihong; Yang, Pengkun
- Source
- 2015 IEEE International Symposium on Information Theory (ISIT), pp. 824-828, Jun. 2015
- Subject
- Communication, Networking and Broadcast Technologies
Signal Processing and Analysis
Entropy
Estimation
Polynomials
Approximation methods
Random variables
large alphabet
high-dimensional statistics
entropy estimation
best polynomial approximation
functional estimation
- ISSN
- 2157-8095
2157-8117
- Abstract
- Consider the problem of estimating the Shannon entropy of a distribution on k elements from n independent samples. We show that the minimax mean-square error is within universal multiplicative constant factors of (k/(n log k))² + (log² k)/n. This implies the recent result of Valiant and Valiant [1] that the minimal sample size for consistent entropy estimation scales as Θ(k/log k). The apparatus of best polynomial approximation plays a key role in both the minimax lower bound and the construction of optimal estimators.
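For context, a minimal sketch of the naive plug-in entropy estimator (not the authors' polynomial-approximation estimator): it replaces the unknown distribution with the empirical frequencies. Its downward bias on large alphabets is what motivates bias correction via best polynomial approximation of x ↦ -x log x. The function name and interface below are illustrative assumptions, not from the paper.

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in Shannon entropy estimate in nats: H(p_hat) = -sum p_hat log p_hat.

    The plug-in estimator is biased downward when the alphabet size k is
    comparable to the sample size n; the paper's estimator corrects this
    bias for symbols seen only a few times, achieving consistency once
    n grows like k/log k rather than k.
    """
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# A uniform distribution over 4 symbols, observed exactly once each:
print(plugin_entropy(['a', 'b', 'c', 'd']))  # → log 4 ≈ 1.386
```

Note that the plug-in estimate depends on the samples only through the empirical counts, a property shared by the optimal estimators studied in the paper.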