Class Summary | |
---|---|
JensenShannonDivergence | Implementation of the Jensen-Shannon divergence, based on the remarks in *Divergence measures based on the Shannon entropy* by Jianhua Lin. |
KullbackLeiblerDivergence | Implementation of the Kullback-Leibler divergence, based on the remarks made by Jianhua Lin in *Divergence measures based on the Shannon entropy*. |
PointwiseKullbackLeiblerDivergence | Pointwise Kullback-Leibler divergence, where the divergence for a term x is defined as that term's contribution to the expected loss of information over the entire probability distribution. |
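The constructors and method signatures of the classes above are not shown on this summary page, so the following is only an illustrative sketch of the underlying formulas (KL divergence, its pointwise per-term contribution, and Lin's Jensen-Shannon divergence); the class and method names here are hypothetical and do not belong to this package.

```java
public class DivergenceSketch {

    // KL(P || Q) = sum over x of p(x) * log2(p(x) / q(x));
    // terms with p(x) == 0 contribute nothing to the sum.
    static double kl(double[] p, double[] q) {
        double sum = 0.0;
        for (int i = 0; i < p.length; i++) {
            if (p[i] > 0.0 && q[i] > 0.0) {
                sum += p[i] * (Math.log(p[i] / q[i]) / Math.log(2));
            }
        }
        return sum;
    }

    // Pointwise KL divergence for a single term x: p(x) * log2(p(x) / q(x)),
    // i.e. the contribution of term x to the total divergence.
    static double pointwiseKl(double px, double qx) {
        return px * (Math.log(px / qx) / Math.log(2));
    }

    // Jensen-Shannon divergence following Lin's definition:
    // JS(P, Q) = 0.5 * KL(P || M) + 0.5 * KL(Q || M), where M = (P + Q) / 2.
    // Symmetric in P and Q, and bounded by 1 when logs are base 2.
    static double js(double[] p, double[] q) {
        double[] m = new double[p.length];
        for (int i = 0; i < p.length; i++) {
            m[i] = 0.5 * (p[i] + q[i]);
        }
        return 0.5 * kl(p, m) + 0.5 * kl(q, m);
    }

    public static void main(String[] args) {
        double[] p = {0.5, 0.5, 0.0};
        double[] q = {0.25, 0.25, 0.5};
        System.out.println("KL(P||Q)       = " + kl(p, q));           // 1.0
        System.out.println("JS(P,Q)        = " + js(p, q));
        System.out.println("pointwise KL(x) = " + pointwiseKl(0.5, 0.25)); // 0.5
    }
}
```

Note that KL is asymmetric and undefined where Q assigns zero probability to an event P supports, which is one motivation Lin gives for the symmetric, always-finite JS variant.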