J. Lin. "Divergence measures based on the Shannon entropy." IEEE Transactions on Information Theory, 37(1):145-151, January 1991.
DOI: 10.1109/18.61115
@article{lin1991divergence,
  abstract = {A novel class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known Kullback divergences, the new measures do not require the condition of absolute continuity to be satisfied by the probability distributions involved. More importantly, their close relationships with the variational distance and the probability of misclassification error are established in terms of bounds. These bounds are crucial in many applications of divergence measures. The measures are also well characterized by the properties of nonnegativity, finiteness, semiboundedness, and boundedness.},
author = {Lin, Jianhua},
  doi = {10.1109/18.61115},
issn = {0018-9448},
journal = {IEEE Transactions on Information Theory},
keywords = {distribution divergence entropy jensen jensen-shannon js probability shannon},
month = jan,
number = 1,
  pages = {145--151},
title = {Divergence measures based on the Shannon entropy},
url = {http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=61115},
volume = 37,
year = 1991
}
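For readers implementing the measure the abstract describes, here is a minimal Python sketch of the Jensen-Shannon divergence, the symmetric, bounded divergence Lin studies in this paper. It assumes two discrete distributions over the same finite alphabet, given as sequences of probabilities; the function names are illustrative, not from the paper:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) in bits; 0 * log(0) is taken as 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def jensen_shannon_divergence(p, q):
    """JS divergence: H((p + q)/2) - (H(p) + H(q))/2.

    Unlike the Kullback-Leibler divergence, this is defined even when
    p and q do not satisfy absolute continuity (i.e. their supports
    differ), and it is bounded between 0 and 1 bit.
    """
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return shannon_entropy(m) - (shannon_entropy(p) + shannon_entropy(q)) / 2
```

For two distributions with disjoint support, such as [1, 0] and [0, 1], the value reaches its upper bound of 1 bit, whereas the KL divergence would be infinite; for identical distributions it is 0.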