class DiagonalGMM

This class can be used to model Diagonal Gaussian Mixture Models.

Inheritance:

DiagonalGMM -> Distribution -> GradientMachine -> Machine -> Object
Public Fields

int n_gaussians
number of Gaussians in the mixture
real prior_weights
prior weight of the Gaussians, used in EM to put a small prior on each Gaussian
EMTrainer* initial_kmeans_trainer
optional initialization: a trainer containing a k-means; if nothing is given, initialization is random, at your own risk
List* initial_kmeans_trainer_measurers
as well as measurers for this initialization trainer
List* initial_params
alternatively, an initial parameter List can be given
char* initial_file
alternatively, an initial file can be given
real* log_weights
pointer to the log mixture weight parameters
real* dlog_weights
pointer to the derivatives of the log mixture weights
real* var_threshold
contains the minimal allowed value of each variance
real** log_probabilities_g
for each frame and each Gaussian, keeps its log probability
real* sum_log_var_plus_n_obs_log_2_pi
to speed up the computation, some values are pre-computed: for each Gaussian, sum_log_var + n_obs * log_2_pi
real** minus_half_over_var
pre-computed -0.5 / var
real** means_acc
accumulators for EM

Public Methods

DiagonalGMM(int n_observations_, int n_gaussians_, real* var_threshold_, real prior_weights_)
virtual real frameLogProbabilityOneGaussian(real* observations, real* inputs, int g)
this method returns the log probability of the given frame under Gaussian "g"


Inherited from Distribution:

Public Fields

int n_observations
int tot_n_frames
int max_n_frames
real log_probability
real* log_probabilities

Public Methods

virtual real logProbability(List* inputs)
virtual real viterbiLogProbability(List* inputs)
virtual real frameLogProbability(real* observations, real* inputs, int t)
virtual void frameExpectation(real* observations, real* inputs, int t)
virtual void eMIterInitialize()
virtual void iterInitialize()
virtual void eMSequenceInitialize(List* inputs)
virtual void sequenceInitialize(List* inputs)
virtual void eMAccPosteriors(List* inputs, real log_posterior)
virtual void frameEMAccPosteriors(real* observations, real log_posterior, real* inputs, int t)
virtual void viterbiAccPosteriors(List* inputs, real log_posterior)
virtual void frameViterbiAccPosteriors(real* observations, real log_posterior, real* inputs, int t)
virtual void eMUpdate()
virtual void decode(List* inputs)
virtual void eMForward(List* inputs)
virtual void viterbiForward(List* inputs)
virtual void frameBackward(real* observations, real* alpha, real* inputs, int t)
virtual void viterbiBackward(List* inputs, real* alpha)
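
The EM-related methods above are normally driven by an EMTrainer rather than called by hand. As a rough sketch of how they fit together, one EM iteration over a set of training sequences might look like the following; only the method names and signatures come from this page, while the call order, the sequences array and the log-posterior value of 0 (i.e. a posterior weight of 1) are assumptions.

    // Rough shape of one EM iteration (illustrative; an EMTrainer normally does this).
    // 'gmm' is a DiagonalGMM*; 'sequences' and 'n_sequences' are a hypothetical
    // array of List* training sequences and its size.
    gmm->eMIterInitialize();                 // reset the EM accumulators
    for (int s = 0; s < n_sequences; s++) {
      List* inputs = sequences[s];
      gmm->eMSequenceInitialize(inputs);     // per-sequence setup
      gmm->eMForward(inputs);                // compute the frame log-probabilities
      gmm->eMAccPosteriors(inputs, 0.);      // accumulate posteriors (log-posterior 0, i.e. weight 1)
    }
    gmm->eMUpdate();                         // re-estimate means, variances and weights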


Inherited from GradientMachine:

Public Fields

bool is_free
List* params
List* der_params
int n_params
real* beta

Public Methods

virtual void init()
virtual int numberOfParams()
virtual void backward(List* inputs, real* alpha)
virtual void allocateMemory()
virtual void freeMemory()
virtual void loadFILE(FILE* file)
virtual void saveFILE(FILE* file)


Inherited from Machine:

Public Fields

int n_inputs
int n_outputs
List* outputs

Public Methods

virtual void forward(List* inputs)
virtual void reset()


Inherited from Object:

Public Methods

void addOption(const char* name, int size, void* ptr, const char* help="", bool is_allowed_after_init=false)
void addIOption(const char* name, int* ptr, int init_value, const char* help="", bool is_allowed_after_init=false)
void addROption(const char* name, real* ptr, real init_value, const char* help="", bool is_allowed_after_init=false)
void addBOption(const char* name, bool* ptr, bool init_value, const char* help="", bool is_allowed_after_init=false)
void setOption(const char* name, void* ptr)
void setIOption(const char* name, int option)
void setROption(const char* name, real option)
void setBOption(const char* name, bool option)
void load(const char* filename)
void save(const char* filename)


Documentation

This class can be used to model Diagonal Gaussian Mixture Models. They can be trained using either EM (with EMTrainer) or gradient descent (with GMTrainer).
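
As a minimal illustration of the constructor documented below, a model for 39-dimensional frames with 16 Gaussians might be built as follows; the dimensions, the variance-floor value, the prior weight and the header name are assumptions chosen for the example, and only the constructor signature is taken from this page.

    #include "DiagonalGMM.h"          // assumed header name

    int main()
    {
      int n_observations = 39;        // dimensionality of one frame (assumption)
      int n_gaussians    = 16;        // number of Gaussians in the mixture (assumption)

      // Variance floor: minimal allowed value of each of the 39 variances.
      real var_threshold[39];
      for (int i = 0; i < n_observations; i++)
        var_threshold[i] = 1e-3;      // example floor value (assumption)

      DiagonalGMM gmm(n_observations, n_gaussians, var_threshold, 0.001);
      // The model can then be trained with an EMTrainer (EM) or a GMTrainer
      // (gradient descent); optionally, gmm.initial_kmeans_trainer can be set
      // to a trainer containing a k-means to initialize the model.
      return 0;
    }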

int n_gaussians
number of Gaussians in the mixture

real prior_weights
prior weight of the Gaussians, used in EM to put a small prior on each Gaussian

EMTrainer* initial_kmeans_trainer
optional initialization: one can give an initial trainer containing a k-means; if nothing is given, initialization is random, at your own risk

List* initial_kmeans_trainer_measurers
as well as measurers for this initialization trainer

List* initial_params
alternatively, an initial parameter List can be given

char* initial_file
alternatively, an initial file can be given

real* log_weights
pointer to the log mixture weight parameters

real* dlog_weights
pointer to the derivatives of the log mixture weights

real* var_threshold
contains the minimal allowed value of each variance

real** log_probabilities_g
for each frame and each Gaussian, keeps its log probability

real* sum_log_var_plus_n_obs_log_2_pi
to speed up the computation, some values are pre-computed: for each Gaussian, sum_log_var + n_obs * log_2_pi

real** minus_half_over_var
pre-computed -0.5 / var

real** means_acc
accumulators for EM

DiagonalGMM(int n_observations_, int n_gaussians_, real* var_threshold_, real prior_weights_)

virtual real frameLogProbabilityOneGaussian(real* observations, real* inputs, int g)
this method returns the log probability of the given frame under Gaussian "g"
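
For reference, the log density of a diagonal Gaussian can be written directly in terms of the pre-computed quantities listed above, which is presumably why they are stored. The sketch below restates that formula; the names mirror this page, but the function itself only illustrates the mathematics and is not the library's actual implementation.

    typedef double real;   // stand-in for the library's 'real' typedef

    // log N(x; mu_g, diag(var_g)) for one frame x of dimension n_observations,
    // using the pre-computed terms:
    //   sum_log_var_plus_n_obs_log_2_pi_g = sum_i log(var_g[i]) + n_obs * log(2*pi)
    //   minus_half_over_var_g[i]          = -0.5 / var_g[i]
    real frame_log_prob_one_gaussian(const real* x, const real* mu_g,
                                     const real* minus_half_over_var_g,
                                     real sum_log_var_plus_n_obs_log_2_pi_g,
                                     int n_observations)
    {
      real acc = 0.;
      for (int i = 0; i < n_observations; i++) {
        real d = x[i] - mu_g[i];
        acc += d * d * minus_half_over_var_g[i];   // -0.5 * (x_i - mu_i)^2 / var_i
      }
      return acc - 0.5 * sum_log_var_plus_n_obs_log_2_pi_g;
    }

The frame log-probability of the full mixture then combines these per-Gaussian terms with the log mixture weights (log_weights) through a log-sum-exp over the n_gaussians components.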


Direct child classes:
Kmeans
Author:
Samy Bengio (bengio@idiap.ch)




This page was generated with the help of DOC++.