calhoun.analysis.crf.solver
Class MaximumLikelihoodSemiMarkovGradient

java.lang.Object
  extended by calhoun.analysis.crf.solver.MaximumLikelihoodSemiMarkovGradient
All Implemented Interfaces:
CRFObjectiveFunctionGradient

public class MaximumLikelihoodSemiMarkovGradient
extends java.lang.Object
implements CRFObjectiveFunctionGradient

Computes the likelihood of the true path for a semi-Markov CRF. The likelihood is normalized to a per-label likelihood.

Debugging output

To get a better understanding of what the objective function is doing, several different properties can be set that cause the objective function to write out trace files showing its calculations during training. When turning these options on, you should usually set maxIters = 1 and requireConvergence = false in your optimizer so that only a single training iteration is run, possibly setting the starting weights to some predetermined value. Each of these properties is configured with a filename, and each time apply(double[], double[]) is called, the file is overwritten with data from the current call. The logging options are the six trace-file properties exposed through the setters below: alphaFile, alphaLengthFile, betaLengthFile, expectFile, expectLengthFile, and nodeMarginalFile.
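As a toy illustration of the overwrite behavior described above (this is a stand-in class, not the real gradient; only the setAlphaFile name is taken from this page), the sketch below shows why a single training iteration gives the most readable traces: each apply call replaces the previous call's file contents.

```java
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Path;

/**
 * Toy stand-in (not the real class) for the trace-file behavior: each call
 * to apply() overwrites the configured file with data from the current call
 * only, so traces from earlier iterations are lost unless maxIters = 1.
 */
public class TraceFileDemo {
    private String alphaFile;  // mirrors setAlphaFile on the gradient

    public void setAlphaFile(String alphaFile) { this.alphaFile = alphaFile; }

    /** Stand-in for apply(): writes this call's alpha values, replacing old ones. */
    public void apply(double[] alphas) throws IOException {
        try (PrintWriter out = new PrintWriter(alphaFile)) {  // truncates the file
            for (double a : alphas) {
                out.println(a);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        TraceFileDemo demo = new TraceFileDemo();
        Path trace = Files.createTempFile("alpha", ".txt");
        demo.setAlphaFile(trace.toString());
        demo.apply(new double[] {0.1, 0.2});   // iteration 1
        demo.apply(new double[] {0.9});        // iteration 2 overwrites iteration 1
        System.out.println(Files.readAllLines(trace)); // only the last call's data
    }
}
```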


Constructor Summary
MaximumLikelihoodSemiMarkovGradient()
           
 
Method Summary
 double apply(double[] param, double[] grad)
          Computes the objective function value and the gradient.
 void clean()
          Frees resources allocated by setTrainingData
 java.lang.String getAlphaFile()
           
 java.lang.String getAlphaLengthFile()
           
 java.lang.String getBetaLengthFile()
           
 CacheProcessor getCacheProcessor()
          Gets the cache processor used to access feature evaluations.
 java.lang.String getExpectFile()
           
 java.lang.String getExpectLengthFile()
           
 double[] getFeatureSums()
           
 java.lang.String getNodeMarginalFile()
           
static java.lang.String printNorm(double value, int norm)
           
 void setAlphaFile(java.lang.String alphaFile)
           
 void setAlphaLengthFile(java.lang.String alphaLengthFile)
           
 void setBetaLengthFile(java.lang.String betaLengthFile)
           
 void setCacheProcessor(CacheProcessor cacheProcessor)
          Sets the cache processor used to access feature evaluations.
 void setExpectFile(java.lang.String expectFile)
           
 void setExpectLengthFile(java.lang.String expectLengthFile)
           
 void setNodeMarginalFile(java.lang.String nodeMarginalFile)
           
 void setTrainingData(ModelManager fm, java.util.List<? extends TrainingSequence<?>> data)
          Sets the training data that will be used for evaluation of the objective function.
 
Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Constructor Detail

MaximumLikelihoodSemiMarkovGradient

public MaximumLikelihoodSemiMarkovGradient()
Method Detail

setTrainingData

public void setTrainingData(ModelManager fm,
                            java.util.List<? extends TrainingSequence<?>> data)
Description copied from interface: CRFObjectiveFunctionGradient
Sets the training data that will be used for evaluation of the objective function. This method will be called before apply is called, to set up the training data. Since apply is expected to be called many times, this method is the place to do one-time setup and caching.

Specified by:
setTrainingData in interface CRFObjectiveFunctionGradient
Parameters:
fm - the model to use. Defines the hidden states, transitions, and features.
data - the training sequences on which to calculate the objective function.
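The one-time-setup-and-caching pattern can be sketched with a toy stand-in (hypothetical class and data layout, not the real semi-Markov implementation): setTrainingData computes quantities that depend only on the data, such as the observed feature sums, and every subsequent apply call reuses the cache instead of rescanning the sequences.

```java
import java.util.Arrays;
import java.util.List;

/**
 * Toy sketch (not the real implementation) of the one-time-setup pattern:
 * setTrainingData caches the observed feature sums over the training data,
 * and every subsequent apply() reuses the cache instead of rescanning.
 */
public class CachingGradientSketch {
    private double[] featureSums;  // cached once, reused on every apply()

    /** One-time setup: sum each feature over all training sequences. */
    public void setTrainingData(int numFeatures, List<double[]> sequenceFeatures) {
        featureSums = new double[numFeatures];
        for (double[] seq : sequenceFeatures) {
            for (int f = 0; f < numFeatures; f++) {
                featureSums[f] += seq[f];
            }
        }
    }

    /** Called many times; uses only the cached sums, never the raw data. */
    public double apply(double[] param, double[] grad) {
        double value = 0.0;
        for (int f = 0; f < param.length; f++) {
            value += param[f] * featureSums[f]; // linear score from cached sums
            grad[f] = featureSums[f];           // d(value)/d(param[f])
        }
        return value;
    }

    public double[] getFeatureSums() { return featureSums; }

    public static void main(String[] args) {
        CachingGradientSketch g = new CachingGradientSketch();
        g.setTrainingData(2, List.of(new double[] {1.0, 2.0}, new double[] {3.0, 4.0}));
        double[] grad = new double[2];
        double value = g.apply(new double[] {0.5, 0.5}, grad);
        System.out.println(value + " " + Arrays.toString(grad));
    }
}
```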

apply

public double apply(double[] param,
                    double[] grad)
Description copied from interface: CRFObjectiveFunctionGradient
Computes the objective function value and the gradient. This method will be called on each iteration of the numerical solver during training. Each iteration supplies a new set of feature weights, and the method computes a value for the objective function along with its gradient.

Specified by:
apply in interface CRFObjectiveFunctionGradient
Parameters:
param - an array of feature weights to use. The length will equal the number of features in the model, and the values will change for each call to apply.
grad - an array that must be filled with the gradient vector before the function returns. For each feature index, the array should contain the partial derivative of the objective with respect to that feature's weight.
Returns:
the value of the objective function. This is what the numerical optimizer will attempt to maximize.
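The apply contract (fill grad in place, return the value to be maximized) can be exercised with a toy concave objective; the gradient-ascent loop below is a hypothetical stand-in for the numerical solver, and the objective f(w) = -Σ(w[i] - c[i])² is chosen for illustration only, not the semi-Markov likelihood.

```java
/**
 * Toy concave objective exercising the apply() contract described above:
 * fill grad with the partial derivatives and return the value the optimizer
 * should maximize. Not the semi-Markov CRF likelihood.
 */
public class ApplyContractDemo {
    static final double[] TARGET = {1.0, -2.0};  // maximizer of the toy objective

    /** Same shape as apply(double[] param, double[] grad). */
    static double apply(double[] param, double[] grad) {
        double value = 0.0;
        for (int i = 0; i < param.length; i++) {
            double diff = param[i] - TARGET[i];
            value -= diff * diff;     // f(w) = -sum((w[i] - c[i])^2)
            grad[i] = -2.0 * diff;    // partial derivative w.r.t. param[i]
        }
        return value;
    }

    public static void main(String[] args) {
        double[] w = new double[2];               // starting weights
        double[] grad = new double[2];
        for (int iter = 0; iter < 100; iter++) {  // stand-in for the solver
            apply(w, grad);
            for (int i = 0; i < w.length; i++) {
                w[i] += 0.1 * grad[i];            // ascend: apply() is maximized
            }
        }
        System.out.printf("w = [%.3f, %.3f]%n", w[0], w[1]);
    }
}
```

Because apply returns the value to maximize, the driver steps in the direction of the gradient; a solver minimizing by convention would negate both the value and grad.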

clean

public void clean()
Description copied from interface: CRFObjectiveFunctionGradient
Frees resources allocated by setTrainingData.

Specified by:
clean in interface CRFObjectiveFunctionGradient

printNorm

public static final java.lang.String printNorm(double value,
                                               int norm)

getCacheProcessor

public CacheProcessor getCacheProcessor()
Gets the cache processor used to access feature evaluations.

Returns:
the configured cache processor

setCacheProcessor

public void setCacheProcessor(CacheProcessor cacheProcessor)
Sets the cache processor used to access feature evaluations.

Parameters:
cacheProcessor - the cache processor to use

getAlphaLengthFile

public java.lang.String getAlphaLengthFile()

setAlphaLengthFile

public void setAlphaLengthFile(java.lang.String alphaLengthFile)

getAlphaFile

public java.lang.String getAlphaFile()

setAlphaFile

public void setAlphaFile(java.lang.String alphaFile)

getExpectFile

public java.lang.String getExpectFile()

setExpectFile

public void setExpectFile(java.lang.String expectFile)

getExpectLengthFile

public java.lang.String getExpectLengthFile()

setExpectLengthFile

public void setExpectLengthFile(java.lang.String expectLengthFile)

getNodeMarginalFile

public java.lang.String getNodeMarginalFile()

setNodeMarginalFile

public void setNodeMarginalFile(java.lang.String nodeMarginalFile)

getBetaLengthFile

public java.lang.String getBetaLengthFile()

setBetaLengthFile

public void setBetaLengthFile(java.lang.String betaLengthFile)

getFeatureSums

public double[] getFeatureSums()