itk::StandardGradientDescentOptimizer Class Reference

#include <itkStandardGradientDescentOptimizer.h>

Inheritance diagram for itk::StandardGradientDescentOptimizer: [diagram not shown]
Collaboration diagram for itk::StandardGradientDescentOptimizer: [diagram not shown]

Public Types

typedef SmartPointer< const Self > ConstPointer
typedef Superclass::CostFunctionType CostFunctionType
typedef Superclass::DerivativeType DerivativeType
typedef Superclass::MeasureType MeasureType
typedef Superclass::ParametersType ParametersType
typedef SmartPointer< Self > Pointer
typedef Superclass::ScaledCostFunctionPointer ScaledCostFunctionPointer
typedef Superclass::ScaledCostFunctionType ScaledCostFunctionType
typedef Superclass::ScalesType ScalesType
typedef StandardGradientDescentOptimizer Self
typedef Superclass::StopConditionType StopConditionType
typedef GradientDescentOptimizer2 Superclass

Public Member Functions

virtual void AdvanceOneStep (void)
virtual const char * GetClassName () const
virtual double GetCurrentTime () const
virtual double GetInitialTime () const
virtual double GetParam_a () const
virtual double GetParam_A () const
virtual double GetParam_alpha () const
virtual void ResetCurrentTimeToInitialTime (void)
virtual void SetInitialTime (double _arg)
virtual void SetParam_a (double _arg)
virtual void SetParam_A (double _arg)
virtual void SetParam_alpha (double _arg)
virtual void StartOptimization (void)

Static Public Member Functions

static Pointer New ()

Protected Member Functions

virtual double Compute_a (double k) const
 StandardGradientDescentOptimizer ()
virtual void UpdateCurrentTime (void)
virtual ~StandardGradientDescentOptimizer ()

Protected Attributes

double m_CurrentTime

Private Member Functions

void operator= (const Self &)
 StandardGradientDescentOptimizer (const Self &)

Private Attributes

double m_InitialTime
double m_Param_a
double m_Param_A
double m_Param_alpha

Detailed Description

This class implements a gradient descent optimizer with a decaying gain.

If $C(x)$ is a cost function that has to be minimised, the following iterative algorithm is used to find the optimal parameters $x$:

\[ x(k+1) = x(k) - a(k) \frac{dC}{dx} \]

The gain $a(k)$ at each iteration $k$ is defined by:

\[ a(k) = \frac{a}{(A + k + 1)^{\alpha}}. \]
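As an illustration only (a minimal C++ sketch mirroring the two formulas above, not the elastix implementation; the function and variable names are hypothetical), the decaying gain drives a plain gradient descent update as follows:

#include <cmath>
#include <cstddef>
#include <vector>

// Decaying gain a(k) = a / (A + k + 1)^alpha, mirroring the formula above.
double ComputeGain( double a, double A, double alpha, unsigned int k )
{
  return a / std::pow( A + k + 1.0, alpha );
}

// One update step: x(k+1) = x(k) - a(k) * dC/dx.
// 'gradient' is assumed to hold the derivative dC/dx at the current x.
void AdvanceOneStep( std::vector< double > & x,
                     const std::vector< double > & gradient,
                     double a, double A, double alpha, unsigned int k )
{
  const double gain = ComputeGain( a, A, alpha, k );
  for ( std::size_t i = 0; i < x.size(); ++i )
  {
    x[ i ] -= gain * gradient[ i ];
  }
}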

It is well suited for use in combination with a stochastic estimate of the gradient $dC/dx$. For example, in image registration problems it is often advantageous to compute the metric derivative ($dC/dx$) on a new set of randomly selected image samples in each iteration. You may set the parameter NewSamplesEveryIteration to "true" to achieve this effect. For more information on this strategy, you may have a look at:

S. Klein, M. Staring, J.P.W. Pluim, "Comparison of gradient approximation techniques for optimisation of mutual information in nonrigid registration", in: SPIE Medical Imaging: Image Processing, Editor(s): J.M. Fitzpatrick, J.M. Reinhardt, SPIE press, 2005, vol. 5747, Proceedings of SPIE, pp. 192-203.

Or:

S. Klein, M. Staring, J.P.W. Pluim, "Evaluation of Optimization Methods for Nonrigid Medical Image Registration using Mutual Information and B-Splines" IEEE Transactions on Image Processing, 2007, nr. 16(12), December.

This class also serves as a base class for other GradientDescent type algorithms, like the AcceleratedGradientDescentOptimizer.

See also:
StandardGradientDescent, AcceleratedGradientDescentOptimizer

Definition at line 61 of file itkStandardGradientDescentOptimizer.h.
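A rough usage sketch, restricted to the members documented on this page (the numeric values are placeholders, not recommendations; setting the cost function, parameter scales, initial position and stopping criteria happens through the superclass interface and is omitted here):

#include "itkStandardGradientDescentOptimizer.h"

int main()
{
  typedef itk::StandardGradientDescentOptimizer OptimizerType;
  OptimizerType::Pointer optimizer = OptimizerType::New();

  /** Parameters of the decaying gain a(k) = a / (A + k + 1)^alpha.
   * The values below are placeholders for illustration only. */
  optimizer->SetParam_a( 400.0 );
  optimizer->SetParam_A( 50.0 );
  optimizer->SetParam_alpha( 0.602 );
  optimizer->SetInitialTime( 0.0 );

  /** Setting the cost function, parameter scales, initial position and
   * stopping criteria is done through the superclass interface and is
   * omitted in this sketch. */

  optimizer->StartOptimization();

  return 0;
}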


Member Typedef Documentation

Typedefs inherited from the superclass.

Reimplemented from itk::GradientDescentOptimizer2.

Reimplemented in itk::AdaptiveStochasticGradientDescentOptimizer.

Definition at line 77 of file itkStandardGradientDescentOptimizer.h.


Constructor & Destructor Documentation

Definition at line 131 of file itkStandardGradientDescentOptimizer.h.


Member Function Documentation

virtual void itk::StandardGradientDescentOptimizer::AdvanceOneStep ( void  ) [virtual]

Sets a new LearningRate before calling the Superclass' implementation, and updates the current time.

Reimplemented from itk::GradientDescentOptimizer2.

virtual double itk::StandardGradientDescentOptimizer::Compute_a ( double  k) const [protected, virtual]

Function to compute the gain $a(k)$ at time/iteration $k$.

virtual double itk::StandardGradientDescentOptimizer::GetCurrentTime ( ) const [virtual]

Get the current time. This equals the CurrentIteration in this base class, but it may be different in inheriting classes, such as the AcceleratedGradientDescentOptimizer.

virtual void itk::StandardGradientDescentOptimizer::ResetCurrentTimeToInitialTime ( void  ) [virtual]

Set the current time to the initial time. This can be useful to 'reset' the optimisation, for example if you changed the cost function while optimising. Be careful with this function.

Definition at line 123 of file itkStandardGradientDescentOptimizer.h.

Set/Get the initial time. Should be >= 0. This function is superfluous, since Param_A effectively does the same. However, in inheriting classes, like the AcceleratedGradientDescent, the initial time may serve a different purpose than Param_A. Default: 0.0

Set/Get a.

Set/Get A.

Set/Get alpha.

Set current time to 0 and call superclass' implementation.

Reimplemented from itk::GradientDescentOptimizer2.

Reimplemented in elastix::AdaptiveStochasticGradientDescent< TElastix >, and elastix::StandardGradientDescent< TElastix >.

virtual void itk::StandardGradientDescentOptimizer::UpdateCurrentTime ( void  ) [protected, virtual]

Function to update the current time. This function just increments the CurrentTime by 1. Inheriting classes may implement something smarter, for example depending on the progress of the optimisation.

Reimplemented in itk::AdaptiveStochasticGradientDescentOptimizer.
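Purely as an illustration of this extension point (a hypothetical subclass, not part of elastix), an inheriting class could override UpdateCurrentTime to advance the time by something other than the fixed increment of 1:

#include "itkStandardGradientDescentOptimizer.h"

/** Hypothetical subclass, for illustration only: takes a time step that
 * shrinks as the optimisation proceeds, so the gain a(k) decays more
 * slowly in later iterations than with the default increment of 1. */
class ProgressDependentTimeOptimizer
  : public itk::StandardGradientDescentOptimizer
{
public:
  typedef ProgressDependentTimeOptimizer        Self;
  typedef itk::StandardGradientDescentOptimizer Superclass;
  typedef itk::SmartPointer< Self >             Pointer;
  itkNewMacro( Self );

protected:
  virtual void UpdateCurrentTime( void )
  {
    /** m_CurrentTime is a protected member of the superclass. */
    this->m_CurrentTime += 1.0 / ( 1.0 + 0.05 * this->m_CurrentTime );
  }
};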


Field Documentation

The current time, which serves as input for Compute_a.

Definition at line 143 of file itkStandardGradientDescentOptimizer.h.

Settings

Definition at line 156 of file itkStandardGradientDescentOptimizer.h.

Parameters, as described by Spall.

Definition at line 151 of file itkStandardGradientDescentOptimizer.h.

Definition at line 152 of file itkStandardGradientDescentOptimizer.h.

Definition at line 153 of file itkStandardGradientDescentOptimizer.h.



Generated on 11-05-2012 for elastix by doxygen 1.7.6.1