PETScWrappers::SolverBase Class Reference
[PETScWrappers]

Inheritance diagram for PETScWrappers::SolverBase: (graph not reproduced here)

Classes

class  ExcPETScError
struct  SolverData

Public Member Functions

 SolverBase (SolverControl &cn, const MPI_Comm &mpi_communicator)
virtual ~SolverBase ()
void solve (const MatrixBase &A, VectorBase &x, const VectorBase &b, const PreconditionerBase &preconditioner)
SolverControl & control () const

Protected Member Functions

virtual void set_solver_type (KSP &ksp) const =0

Protected Attributes

SolverControl & solver_control
const MPI_Comm mpi_communicator

Static Private Member Functions

static int convergence_test (KSP ksp, const int iteration, const PetscScalar residual_norm, KSPConvergedReason *reason, void *solver_control)

Private Attributes

std_cxx1x::shared_ptr< SolverData > solver_data


Detailed Description

Base class for solver classes using the PETSc solvers. Since solvers in PETSc are selected based on flags passed to a generic solver object, basically all the actual solver calls happen in this class, and derived classes simply set the right flags to select one solver or another, or to set certain parameters for individual solvers.
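As an illustration, here is a minimal usage sketch, not taken from the library documentation itself: it assumes that a sequential PETScWrappers::SparseMatrix A and PETScWrappers::Vector objects x and b have been assembled elsewhere, and it picks PETScWrappers::SolverCG and PETScWrappers::PreconditionJacobi merely as examples of a derived solver class and a preconditioner class (the include paths are those of older deal.II releases; newer releases use <deal.II/lac/...>):

  #include <lac/solver_control.h>
  #include <lac/petsc_solver.h>
  #include <lac/petsc_precondition.h>

  void solve_sequential (const PETScWrappers::SparseMatrix &A,
                         PETScWrappers::Vector             &x,
                         const PETScWrappers::Vector       &b)
  {
    // Stop after at most 1000 iterations or once the residual drops below 1e-10.
    SolverControl solver_control (1000, 1e-10);

    // The derived class only selects the PETSc solver type (here: CG); the
    // actual solver call happens in SolverBase::solve(). The default
    // communicator PETSC_COMM_SELF is appropriate for this sequential case.
    PETScWrappers::SolverCG solver (solver_control);

    // The preconditioner object likewise only selects one of PETSc's
    // preconditioner types.
    PETScWrappers::PreconditionJacobi preconditioner (A);

    solver.solve (A, x, b, preconditioner);
  }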

One of the gotchas of PETSc is that -- in particular in MPI mode -- it often does not produce very helpful error messages. To save other users some time tracking down one such error, here is a common situation and the message it produces: you did not specify an MPI communicator in your solver's constructor. In that case, each of your parallel processes prints an error of the following form:

  [1]PETSC ERROR: PCSetVector() line 1173 in src/ksp/pc/interface/precon.c
  [1]PETSC ERROR:   Arguments must have same communicators!
  [1]PETSC ERROR:   Different communicators in the two objects: Argument # 1 and 2!
  [1]PETSC ERROR: KSPSetUp() line 195 in src/ksp/ksp/interface/itfunc.c

This error, which can otherwise take a very long time to track down, results from not specifying an MPI communicator. Note that the communicator must match that of the matrix and of all vectors in the linear system we want to solve. To make matters worse, the default argument of the solver classes, PETSC_COMM_SELF, is the appropriate argument for the sequential case (which is why it is the default), so this error only shows up in parallel mode.
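The parallel case then looks, for example, as follows; the sketch assumes that the matrix and vectors are PETScWrappers::MPI objects that were built on MPI_COMM_WORLD elsewhere:

  #include <lac/solver_control.h>
  #include <lac/petsc_solver.h>
  #include <lac/petsc_precondition.h>

  void solve_parallel (const PETScWrappers::MPI::SparseMatrix &A,
                       PETScWrappers::MPI::Vector             &x,
                       const PETScWrappers::MPI::Vector       &b)
  {
    SolverControl solver_control (1000, 1e-10);

    // Wrong in parallel: omitting the communicator falls back to
    // PETSC_COMM_SELF, which does not match the communicator of A, x, and b
    // and produces the error quoted above:
    //   PETScWrappers::SolverCG solver (solver_control);

    // Correct: pass the communicator on which the linear system lives.
    PETScWrappers::SolverCG solver (solver_control, MPI_COMM_WORLD);

    PETScWrappers::PreconditionBlockJacobi preconditioner (A);
    solver.solve (A, x, b, preconditioner);
  }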

Author:
Wolfgang Bangerth, 2004

Constructor & Destructor Documentation

PETScWrappers::SolverBase::SolverBase ( SolverControl &   cn,
                                        const MPI_Comm &  mpi_communicator )

Constructor. Takes the solver control object and the MPI communicator over which parallel computations are to happen.

Note that the communicator used here must match the communicator used for the system matrix, solution vector, and right hand side vector of the solve to be done with this solver. Otherwise, PETSc will generate errors that are hard to track down; see the documentation of the SolverBase class.

virtual PETScWrappers::SolverBase::~SolverBase (  )  [virtual]

Destructor.


Member Function Documentation

void PETScWrappers::SolverBase::solve ( const MatrixBase &          A,
                                        VectorBase &                x,
                                        const VectorBase &          b,
                                        const PreconditionerBase &  preconditioner )

Solve the linear system Ax=b. Depending on the information provided by derived classes and the object passed as a preconditioner, one of the linear solvers and preconditioners of PETSc is chosen.

SolverControl& PETScWrappers::SolverBase::control (  )  const

Access to object that controls convergence.
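For illustration, here is a small sketch of how one might query convergence information after a solve; the helper function report_convergence is made up for this example:

  #include <lac/solver_control.h>
  #include <lac/petsc_solver.h>
  #include <iostream>

  // Hypothetical helper: report how the last solve with a PETSc wrapper solver went.
  void report_convergence (const PETScWrappers::SolverBase &solver)
  {
    // PETSc performs the convergence check itself, but the results are copied
    // back into the SolverControl object, so the usual deal.II queries work:
    const SolverControl &control = solver.control ();

    std::cout << "Converged in " << control.last_step ()
              << " iterations, final residual " << control.last_value ()
              << std::endl;
  }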

virtual void PETScWrappers::SolverBase::set_solver_type ( KSP &  ksp  )  const [protected, pure virtual]

Function that takes a PETSc Krylov subspace (KSP) solver context and sets the solver type on it. Derived classes implement this pure virtual function to select one of PETSc's solvers and to set any solver-specific flags.

static int PETScWrappers::SolverBase::convergence_test ( KSP  ksp,
const int  iteration,
const PetscScalar  residual_norm,
KSPConvergedReason *  reason,
void *  solver_control 
) [static, private]

A function that is used in PETSc as a callback to check on convergence. It takes the information provided from PETSc and checks it against deal.II's own SolverControl objects to see if convergence has been reached.
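As a rough sketch of the mechanism (not the actual deal.II implementation), a callback of this shape can be registered with PETSc's KSPSetConvergenceTest and forward the residual to a SolverControl object. The function name my_convergence_test is made up, the code assumes a real-valued PetscScalar, and the exact KSPSetConvergenceTest signature differs between PETSc versions:

  #include <lac/solver_control.h>
  #include <petscksp.h>

  int my_convergence_test (KSP /*ksp*/, const int iteration,
                           const PetscScalar residual_norm,
                           KSPConvergedReason *reason, void *solver_control)
  {
    SolverControl &control = *reinterpret_cast<SolverControl*> (solver_control);

    // Let deal.II's SolverControl decide, then translate its answer into
    // something PETSc understands.
    switch (control.check (iteration, residual_norm))
      {
        case SolverControl::success:
          *reason = KSP_CONVERGED_RTOL;
          break;
        case SolverControl::failure:
          *reason = KSP_DIVERGED_ITS;
          break;
        default:
          *reason = KSP_CONVERGED_ITERATING;
      }
    return 0;
  }

  // Registration with the KSP object, e.g. before calling KSPSolve:
  //   KSPSetConvergenceTest (ksp, &my_convergence_test, &solver_control, 0);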


Member Data Documentation

SolverControl & PETScWrappers::SolverBase::solver_control [protected]

Reference to the object that controls convergence of the iterative solver. In fact, for these PETSc wrappers, PETSc does so itself, but we copy the data from this object before starting the solution process and copy the data back into it afterwards.

const MPI_Comm PETScWrappers::SolverBase::mpi_communicator [protected]

Copy of the MPI communicator object to be used for the solver.

std_cxx1x::shared_ptr<SolverData> PETScWrappers::SolverBase::solver_data [private]

Pointer to an object that stores the solver context. This is recreated in the main solver routine if necessary.

