de.aitools.ir.retrievalmodels.relevance.probabilistic
Class KullbackLeiblerDivergence

java.lang.Object
  extended by de.aitools.ir.retrievalmodels.relevance.probabilistic.KullbackLeiblerDivergence
All Implemented Interfaces:
RelevanceFunction<Vector,Vector>

public class KullbackLeiblerDivergence
extends java.lang.Object
implements RelevanceFunction<Vector,Vector>

Implementation of the Kullback-Leibler divergence, based on the remarks made by Jianhua Lin in "Divergence measures based on the Shannon entropy". The divergence is non-negative and additive, but not symmetric; it can nevertheless be used as a distance measure. Let p1 and p2 be two probability distributions. The KL divergence measures the number of bits wasted if one encodes samples drawn from p1 using the probability distribution p2 instead of p1.
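
For reference, the standard definition (the base-2 logarithm matches the bits interpretation above):

    D(p_1 \,\|\, p_2) = \sum_{x} p_1(x) \log_2 \frac{p_1(x)}{p_2(x)}

where a term with p_1(x) = 0 contributes 0 by convention.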

The divergence is undefined if p1(x) is unequal to 0 while p2(x) equals 0 for some element x. This implementation returns MAXIMUM_DISTANCE if the divergence is undefined.

Version:
aitools 3.0 Created on Apr 18, 2010 $Id: KullbackLeiblerDivergence.java,v 1.1 2010/05/19 15:52:03 poma1006 Exp $
Author:
[email protected]

Field Summary
static double MAXIMUM_DISTANCE
           The distance value returned by compute(Vector, Vector) when the divergence is undefined.
 
Constructor Summary
KullbackLeiblerDivergence()
           Constructs a new Kullback-Leibler divergence relevance function.
 
Method Summary
 double compute(Vector p1, Vector p2)
           Computes the Kullback-Leibler divergence D(p1 || p2) of the two probability distributions.
 
Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Field Detail

MAXIMUM_DISTANCE

public static final double MAXIMUM_DISTANCE
The distance value returned by compute(Vector, Vector) when the divergence is undefined.
See Also:
Constant Field Values
Constructor Detail

KullbackLeiblerDivergence

public KullbackLeiblerDivergence()
Method Detail

compute

public double compute(Vector p1,
                      Vector p2)
Computes the Kullback-Leibler divergence D(p1 || p2) of the two probability distributions. Returns MAXIMUM_DISTANCE if p2(x) equals 0 for some element x with p1(x) unequal to 0, that is, if the divergence is undefined.
Specified by:
compute in interface RelevanceFunction<Vector,Vector>
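
Since the page gives no usage example, the following stand-alone sketch mirrors the computation described above on plain double arrays. The aitools Vector API is not documented here, and the concrete value of MAXIMUM_DISTANCE is not shown, so the sketch substitutes Double.MAX_VALUE (an assumption):

public final class KlDivergenceSketch {

  // Stand-in for KullbackLeiblerDivergence.MAXIMUM_DISTANCE; the actual
  // constant's value is not shown on this page (assumed here).
  static final double MAXIMUM_DISTANCE = Double.MAX_VALUE;

  // D(p1 || p2) = sum_x p1(x) * log2(p1(x) / p2(x)), with 0 * log(0/q) = 0.
  static double klDivergence(double[] p1, double[] p2) {
    double divergence = 0.0;
    for (int x = 0; x < p1.length; ++x) {
      if (p1[x] == 0.0) {
        continue;                   // convention: this term contributes 0
      }
      if (p2[x] == 0.0) {
        return MAXIMUM_DISTANCE;    // divergence is undefined for this pair
      }
      divergence += p1[x] * Math.log(p1[x] / p2[x]) / Math.log(2.0);
    }
    return divergence;
  }

  public static void main(String[] args) {
    double[] p1 = {0.5, 0.5};
    double[] p2 = {0.9, 0.1};
    System.out.println(klDivergence(p1, p2)); // ~0.737 bits
    System.out.println(klDivergence(p2, p1)); // ~0.531 bits: not symmetric
  }
}

The second call illustrates the asymmetry noted in the class description: swapping the arguments generally yields a different value.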