calhoun.analysis.crf.solver
Class MaximumLikelihoodGradient

java.lang.Object
  extended by calhoun.analysis.crf.solver.MaximumLikelihoodGradient
All Implemented Interfaces:
CRFObjectiveFunctionGradient

public class MaximumLikelihoodGradient
extends java.lang.Object
implements CRFObjectiveFunctionGradient

Computes the likelihood of the true path for a Markov CRF. The likelihood is normalized to a per-label likelihood so that the likelihoods of paths of different lengths can be meaningfully compared and a single set of optimization tolerances can be used. Must be configured with a CacheProcessor.
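The per-label normalization described above can be sketched as follows. This is a toy illustration only: `perLabelLogLikelihood` is a hypothetical helper, not a method of this class, and it assumes the normalization is a simple division by the number of labels in the sequence.

```java
public class PerLabelNormalization {
    // Normalizes a sequence's total log-likelihood by its length so that
    // sequences of different lengths yield comparable objective values.
    public static double perLabelLogLikelihood(double totalLogLikelihood, int numLabels) {
        return totalLogLikelihood / numLabels;
    }

    public static void main(String[] args) {
        // A short and a long sequence with the same per-label quality
        // produce the same normalized value.
        double shortSeq = perLabelLogLikelihood(-5.0, 10);   // -0.5
        double longSeq  = perLabelLogLikelihood(-50.0, 100); // -0.5
        System.out.println(shortSeq + " " + longSeq);
    }
}
```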


Constructor Summary
MaximumLikelihoodGradient()
           
 
Method Summary
 double apply(double[] param, double[] grad)
          Computes the objective function value and the gradient.
 void clean()
          Frees resources allocated by setTrainingData.
 CacheProcessor getCacheProcessor()
          Gets the cache processor used to access feature evaluations.
 double[] getFeatureSums()
           
 void setCacheProcessor(CacheProcessor cacheProcessor)
          Sets the cache processor used to access feature evaluations.
 void setTrainingData(ModelManager fm, java.util.List<? extends TrainingSequence<?>> data)
          Sets the training data that will be used for evaluation of the objective function.
 
Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Constructor Detail

MaximumLikelihoodGradient

public MaximumLikelihoodGradient()
Method Detail

getCacheProcessor

public CacheProcessor getCacheProcessor()
Gets the cache processor used to access feature evaluations.

Returns:
the configured cache processor

setCacheProcessor

public void setCacheProcessor(CacheProcessor cacheProcessor)
Sets the cache processor used to access feature evaluations.

Parameters:
cacheProcessor - the cache processor to use

setTrainingData

public void setTrainingData(ModelManager fm,
                            java.util.List<? extends TrainingSequence<?>> data)
Description copied from interface: CRFObjectiveFunctionGradient
Sets the training data that will be used for evaluation of the objective function. This method is called before apply to set up the training data. Since apply is expected to be called many times, this method is the place to do one-time setup and caching.

Specified by:
setTrainingData in interface CRFObjectiveFunctionGradient
Parameters:
fm - the model to use. Defines the hidden states, transitions, and features.
data - the training sequences on which to calculate the objective function.
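The lifecycle described above (one-time setup in setTrainingData, many apply calls, then clean) can be sketched with a self-contained stand-in. The types here are simplified stand-ins, not the real ModelManager and TrainingSequence classes, and the objective is a toy weighted sum chosen only to show where caching pays off.

```java
import java.util.Arrays;

public class LifecycleSketch {
    // Stand-in for the gradient lifecycle: expensive per-sequence work is
    // done once in setTrainingData so repeated apply calls stay cheap.
    private double[] cachedSums;

    public void setTrainingData(double[][] sequences) {
        // One-time setup: cache a per-sequence sum (stand-in for feature caching).
        cachedSums = new double[sequences.length];
        for (int i = 0; i < sequences.length; i++) {
            cachedSums[i] = Arrays.stream(sequences[i]).sum();
        }
    }

    public double apply(double[] param, double[] grad) {
        // Reuses the cache built in setTrainingData (toy objective: weighted sum).
        double value = 0.0;
        for (double sum : cachedSums) {
            value += param[0] * sum;
        }
        grad[0] = 0.0; // gradient omitted in this sketch
        return value;
    }

    public void clean() {
        cachedSums = null; // free the cache allocated in setTrainingData
    }
}
```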

apply

public double apply(double[] param,
                    double[] grad)
Description copied from interface: CRFObjectiveFunctionGradient
Computes the objective function value and the gradient. This method is called once per iteration of the numerical solver during training. On each iteration it receives a new set of feature weights and computes the value of the objective function along with its gradient.

Specified by:
apply in interface CRFObjectiveFunctionGradient
Parameters:
param - an array of feature weights to use. The length will equal the number of features in the model, and the values will change for each call to apply.
grad - an array which must be filled with the gradient vector when the function returns. For each feature index, the array should contain an entry with the partial derivative with respect to that feature.
Returns:
the value of the objective function. This is what the numerical optimizer will attempt to maximize.
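The contract above (fill grad with the partial derivatives, return the objective value) can be illustrated with a self-contained toy. The objective here is a simple concave quadratic, not the CRF likelihood; it only demonstrates the apply(double[], double[]) calling convention.

```java
public class ApplyContractDemo {
    // Mirrors the apply contract: given parameter values, fill grad with the
    // partial derivatives and return the objective value to be maximized.
    // Toy objective: f(w) = -sum(w_i^2), maximized at w = 0.
    public static double apply(double[] param, double[] grad) {
        double value = 0.0;
        for (int i = 0; i < param.length; i++) {
            value -= param[i] * param[i];
            grad[i] = -2.0 * param[i]; // d f / d w_i
        }
        return value;
    }

    public static void main(String[] args) {
        double[] param = {1.0, -2.0};
        double[] grad = new double[param.length];
        double value = apply(param, grad);
        System.out.println(value);                   // -5.0
        System.out.println(grad[0] + " " + grad[1]); // -2.0 4.0
    }
}
```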

clean

public void clean()
Description copied from interface: CRFObjectiveFunctionGradient
Frees resources allocated by setTrainingData.

Specified by:
clean in interface CRFObjectiveFunctionGradient

getFeatureSums

public double[] getFeatureSums()