de.aitools.ir.retrievalmodels.relevance.probabilistic
Class PointwiseKullbackLeiblerDivergence

java.lang.Object
  extended by de.aitools.ir.retrievalmodels.relevance.probabilistic.PointwiseKullbackLeiblerDivergence
All Implemented Interfaces:
RelevanceFunction<java.lang.Double,java.lang.Double>

public class PointwiseKullbackLeiblerDivergence
extends java.lang.Object
implements RelevanceFunction<java.lang.Double,java.lang.Double>

The pointwise Kullback-Leibler divergence of a term x is the contribution of that single term to the loss of information over the entire probability distribution. It is exactly the term inside the summation of the KullbackLeiblerDivergence: p(x) * log(p(x) / q(x)).

Informativeness
To compute informativeness, v1 has to be taken from a foreground language model LM_fg and v2 from a background language model LM_bg. The result then quantifies the loss in information incurred when the term is drawn from LM_bg instead of LM_fg.

Phraseness
For phraseness, v2 is drawn from a lower-order language model over the same text collection; for informativeness, by contrast, v2 is drawn from a completely different collection.
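Tomokiyo and Hurst combine the two scores by simple addition. A minimal, self-contained sketch of that combination (the class name, the probability values, and the base-2 logarithm below are illustrative assumptions, not part of this API):

```java
// Hedged sketch of combining phraseness and informativeness via
// pointwise KL divergence, following Tomokiyo and Hurst (2003).
// All names and probability values here are illustrative assumptions.
public class KeyphraseScoreSketch {

    // Pointwise KL summand: p * log2(p / q). Base-2 log is an assumption.
    static double pointwiseKl(double p, double q) {
        return p * (Math.log(p / q) / Math.log(2.0));
    }

    public static void main(String[] args) {
        // Phraseness: foreground n-gram model vs. a lower-order (unigram)
        // model over the same collection.
        double phraseness = pointwiseKl(0.004, 0.001);
        // Informativeness: foreground model vs. a background model built
        // from a different collection.
        double informativeness = pointwiseKl(0.004, 0.0005);
        // Unweighted sum of the two scores, as proposed in the paper.
        System.out.println(phraseness + informativeness);
    }
}
```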

 References:
 T. Tomokiyo and M. Hurst. A Language Model Approach to Keyphrase Extraction.
 In Proceedings of the ACL 2003 Workshop on Multiword Expressions,
 pages 33–40.
 

Version:
$Id: PointwiseKullbackLeiblerDivergence.java,v 1.1 2012/04/23 13:44:42 hoppe Exp $
Author:
dennis.hoppe(/\t)uni-weimar.de

Constructor Summary
PointwiseKullbackLeiblerDivergence()
           
 
Method Summary
 double compute(java.lang.Double v1, java.lang.Double v2)
           
 
Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Constructor Detail

PointwiseKullbackLeiblerDivergence

public PointwiseKullbackLeiblerDivergence()
Method Detail

compute

public double compute(java.lang.Double v1,
                      java.lang.Double v2)
Computes the pointwise Kullback-Leibler divergence of a term, where v1 is its probability under the foreground model and v2 its probability under the background (or lower-order) model.
Specified by:
compute in interface RelevanceFunction<java.lang.Double,java.lang.Double>
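Since the page gives no method body, a plausible, hedged reading is that compute returns the pointwise KL summand for v1 and v2. A self-contained sketch under that assumption (the log base and the handling of zero probabilities are guesses, not confirmed behavior of this class):

```java
// Hedged sketch of what compute(v1, v2) plausibly returns: the pointwise
// KL summand v1 * log(v1 / v2). The log base and zero-handling are
// assumptions; the actual class may differ.
public class PointwiseKlSketch {

    static double compute(double v1, double v2) {
        if (v1 == 0.0) {
            return 0.0; // assumed convention: 0 * log(0 / q) = 0
        }
        return v1 * (Math.log(v1 / v2) / Math.log(2.0));
    }

    public static void main(String[] args) {
        // A term four times as likely under the foreground model
        // contributes v1 * log2(4) = 2 * v1.
        System.out.println(compute(0.1, 0.025));
    }
}
```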