itk::FiniteDifferenceGradientDescentOptimizer Class Reference

#include <itkFiniteDifferenceGradientDescentOptimizer.h>

Inheritance diagram for itk::FiniteDifferenceGradientDescentOptimizer: (graph omitted)
Collaboration diagram for itk::FiniteDifferenceGradientDescentOptimizer: (graph omitted)

Public Types

typedef SmartPointer< const Self > ConstPointer
typedef SmartPointer< Self > Pointer
typedef FiniteDifferenceGradientDescentOptimizer Self
enum  StopConditionType { MaximumNumberOfIterations, MetricError }
typedef ScaledSingleValuedNonLinearOptimizer Superclass

Public Member Functions

virtual void AdvanceOneStep (void)
virtual void ComputeCurrentValueOff ()
virtual void ComputeCurrentValueOn ()
virtual const char * GetClassName () const
virtual bool GetComputeCurrentValue () const
virtual unsigned long GetCurrentIteration () const
virtual double GetGradientMagnitude () const
virtual double GetLearningRate () const
virtual unsigned long GetNumberOfIterations () const
virtual double GetParam_a ()
virtual double GetParam_A ()
virtual double GetParam_alpha ()
virtual double GetParam_c ()
virtual double GetParam_gamma ()
virtual StopConditionType GetStopCondition () const
virtual double GetValue () const
void ResumeOptimization (void)
virtual void SetComputeCurrentValue (bool _arg)
virtual void SetNumberOfIterations (unsigned long _arg)
virtual void SetParam_a (double _arg)
virtual void SetParam_A (double _arg)
virtual void SetParam_alpha (double _arg)
virtual void SetParam_c (double _arg)
virtual void SetParam_gamma (double _arg)
void StartOptimization (void)
void StopOptimization (void)

Static Public Member Functions

static Pointer New ()

Protected Member Functions

virtual double Compute_a (unsigned long k) const
virtual double Compute_c (unsigned long k) const
 FiniteDifferenceGradientDescentOptimizer ()
void PrintSelf (std::ostream &os, Indent indent) const
virtual ~FiniteDifferenceGradientDescentOptimizer ()

Protected Attributes

bool m_ComputeCurrentValue
DerivativeType m_Gradient
double m_GradientMagnitude
double m_LearningRate

Private Member Functions

 FiniteDifferenceGradientDescentOptimizer (const Self &)
void operator= (const Self &)

Private Attributes

unsigned long m_CurrentIteration
unsigned long m_NumberOfIterations
double m_Param_a
double m_Param_A
double m_Param_alpha
double m_Param_c
double m_Param_gamma
bool m_Stop
StopConditionType m_StopCondition
double m_Value

Detailed Description

An optimizer based on gradient descent ...

If $C(x)$ is a cost function that has to be minimised, the following iterative algorithm is used to find the optimal parameters $x$:

\[ x(k+1)_j = x(k)_j - a(k) \frac{ C(x(k)_j + c(k)) - C(x(k)_j - c(k)) }{ 2 c(k) }, \]

for all parameters $j$.

From this equation it is clear that this is a gradient descent optimizer that uses a finite difference approximation of the gradient.

The gain $a(k)$ at each iteration $k$ is defined by:

\[ a(k) = a / (A + k + 1)^{\alpha}. \]

The perturbation size $c(k)$ at each iteration $k$ is defined by:

\[ c(k) = c / (k + 1)^{\gamma}. \]

Note the similarities to the SimultaneousPerturbation optimizer and the StandardGradientDescent optimizer.

See also:
FiniteDifferenceGradientDescent

Definition at line 52 of file itkFiniteDifferenceGradientDescentOptimizer.h.


Member Typedef Documentation

Standard class typedefs.

Reimplemented from itk::ScaledSingleValuedNonLinearOptimizer.

Reimplemented in elastix::FiniteDifferenceGradientDescent< TElastix >.

Definition at line 58 of file itkFiniteDifferenceGradientDescentOptimizer.h.


Member Enumeration Documentation

Codes of stopping conditions

Enumerator:
MaximumNumberOfIterations 
MetricError 

Reimplemented in elastix::FiniteDifferenceGradientDescent< TElastix >.

Definition at line 70 of file itkFiniteDifferenceGradientDescentOptimizer.h.


Constructor & Destructor Documentation


Member Function Documentation

Advance one step following the gradient direction.

virtual double itk::FiniteDifferenceGradientDescentOptimizer::Compute_a ( unsigned long  k) const [protected, virtual]

Compute the gain $a(k)$ at iteration $k$, using the formula given in the detailed description.

virtual double itk::FiniteDifferenceGradientDescentOptimizer::Compute_c ( unsigned long  k) const [protected, virtual]

Compute the perturbation size $c(k)$ at iteration $k$, using the formula given in the detailed description.
virtual const char* itk::FiniteDifferenceGradientDescentOptimizer::GetClassName ( ) const [virtual]

Run-time type information (and related methods).

Reimplemented from itk::ScaledSingleValuedNonLinearOptimizer.

Reimplemented in elastix::FiniteDifferenceGradientDescent< TElastix >.

virtual unsigned long itk::FiniteDifferenceGradientDescentOptimizer::GetCurrentIteration ( ) const [virtual]

Get the current iteration number.

Get the GradientMagnitude and LearningRate ($a_k$).

Get the number of iterations.

Get Stop condition.

Get the current value.

Method for creation through the object factory.

Reimplemented from itk::ScaledSingleValuedNonLinearOptimizer.

Reimplemented in elastix::FiniteDifferenceGradientDescent< TElastix >.

void itk::FiniteDifferenceGradientDescentOptimizer::operator= ( const Self & ) [private]
void itk::FiniteDifferenceGradientDescentOptimizer::PrintSelf ( std::ostream &  os,
Indent  indent 
) const [protected]

PrintSelf method.

Reimplemented from itk::ScaledSingleValuedNonLinearOptimizer.

Resume previously stopped optimization with current parameters

See also:
StopOptimization.
virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetNumberOfIterations ( unsigned long  _arg) [virtual]

Set the number of iterations.

Set/Get a.

Set/Get A.

Set/Get alpha.

Set/Get c.

Set/Get gamma.

Start optimization.

Reimplemented in elastix::FiniteDifferenceGradientDescent< TElastix >.

Stop optimization.

See also:
ResumeOptimization

Field Documentation

Boolean that says if the current value of the metric has to be computed. This is not necessary for optimisation; just nice for progress information.

Definition at line 151 of file itkFiniteDifferenceGradientDescentOptimizer.h.

Parameters, as described by Spall.

Definition at line 170 of file itkFiniteDifferenceGradientDescentOptimizer.h.

Private member variables.

Definition at line 163 of file itkFiniteDifferenceGradientDescentOptimizer.h.



Generated on 11-05-2012 for elastix by doxygen 1.7.6.1