KOŠTÁL, Lubomír and Ondřej POKORA. Nonparametric estimation of information-based measures of statistical dispersion. Entropy. 2012, vol. 14, No 7, p. 1221–1233. ISSN 1099-4300. Available from: https://dx.doi.org/10.3390/e14071221.
Basic information
Original name Nonparametric estimation of information-based measures of statistical dispersion
Authors KOŠTÁL, Lubomír and Ondřej POKORA.
Edition Entropy, 2012, ISSN 1099-4300.
Other information
Original language English
Type of outcome Article in a journal
Field of Study 10103 Statistics and probability
Country of publisher Switzerland
Confidentiality degree is not subject to a state or trade secret
Impact factor 1.347
Organization unit Faculty of Science
DOI http://dx.doi.org/10.3390/e14071221
UT WoS 000306748500007
Keywords in English statistical dispersion; entropy; Fisher information; nonparametric density estimation
Tags International impact, Reviewed
Changed by Mgr. Ondřej Pokora, Ph.D., učo 42536. Changed: 13/3/2018 16:05.
Abstract
We address the problem of non-parametric estimation of the recently proposed measures of statistical dispersion of positive continuous random variables. The measures are based on the concepts of differential entropy and Fisher information and describe the "spread" or "variability" of the random variable from a different point of view than the ubiquitously used concept of standard deviation. The maximum penalized likelihood estimation of the probability density function proposed by Good and Gaskins is applied and a complete methodology of how to estimate the dispersion measures with a single algorithm is presented. We illustrate the approach on three standard statistical models describing neuronal activity.
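The paper's own algorithm relies on the Good-Gaskins maximum penalized likelihood density estimate; as a rough, hypothetical illustration of the plug-in idea behind such nonparametric estimators, the sketch below computes the differential-entropy and Fisher-information quantities from a Gaussian kernel density estimate instead. All function names and parameters here are illustrative assumptions, not taken from the paper.

# Minimal sketch (assumption: Gaussian KDE plug-in, NOT the Good-Gaskins
# penalized likelihood estimator used in the paper).
import numpy as np
from scipy.stats import gaussian_kde
from scipy.integrate import trapezoid

def dispersion_estimates(sample, grid_size=2000):
    """Plug-in estimates of h(X) = -int f log f and J(X) = int (f')^2 / f."""
    kde = gaussian_kde(sample)
    x = np.linspace(sample.min(), sample.max(), grid_size)
    f = np.clip(kde(x), 1e-12, None)   # evaluate density, avoid log(0)
    dx = x[1] - x[0]
    entropy = -trapezoid(f * np.log(f), dx=dx)
    df = np.gradient(f, dx)            # numerical derivative of the density
    fisher = trapezoid(df ** 2 / f, dx=dx)
    return entropy, fisher

# Toy usage: exponentially distributed interspike intervals, one of the
# standard statistical models of neuronal activity mentioned in the abstract.
rng = np.random.default_rng(0)
isi = rng.exponential(scale=1.0, size=5000)
h, j = dispersion_estimates(isi)
print(f"differential entropy ~ {h:.3f}, Fisher information ~ {j:.3f}")

The kernel-based plug-in is used here only because it is widely available in SciPy; it smooths the density near the origin and is therefore only a crude stand-in for the penalized-likelihood approach the authors advocate.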