Classes

    class ExcPETScError
    struct SolverData

Public Member Functions

    SolverBase (SolverControl &cn, const MPI_Comm &mpi_communicator)
    virtual ~SolverBase ()
    void solve (const MatrixBase &A, VectorBase &x, const VectorBase &b, const PreconditionerBase &preconditioner)
    SolverControl & control () const

Protected Member Functions

    virtual void set_solver_type (KSP &ksp) const = 0

Protected Attributes

    SolverControl & solver_control
    const MPI_Comm mpi_communicator

Static Private Member Functions

    static int convergence_test (KSP ksp, const int iteration, const PetscScalar residual_norm, KSPConvergedReason *reason, void *solver_control)

Private Attributes

    std_cxx1x::shared_ptr< SolverData > solver_data
One of the gotchas of PETSc is that -- in particular in MPI mode -- it often does not produce very helpful error messages. To save other users the time of tracking down a hard-to-find error, here is one situation and the error message it produces: not specifying an MPI communicator in your solver's constructor. In this case, each of your parallel processes will emit an error of the following form:
  [1]PETSC ERROR: PCSetVector() line 1173 in src/ksp/pc/interface/precon.c
  [1]PETSC ERROR: Arguments must have same communicators!
  [1]PETSC ERROR: Different communicators in the two objects: Argument # 1 and 2!
  [1]PETSC ERROR: KSPSetUp() line 195 in src/ksp/ksp/interface/itfunc.c
This error, which can take a very long time to track down, results from not specifying an MPI communicator. Note that the communicator must match that of the matrix and all vectors in the linear system we want to solve. Aggravating the situation is the fact that the default argument to the solver classes, PETSC_COMM_SELF, is the appropriate argument for the sequential case (which is why it is the default argument), so this error only shows up in parallel mode.
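As a minimal sketch (assuming deal.II's PETSc wrapper headers are included and a linear system that lives on MPI_COMM_WORLD; the variable names are placeholders, not part of this class's interface):

  SolverControl solver_control (1000, 1e-12);

  // Wrong in MPI mode: the omitted communicator defaults to PETSC_COMM_SELF and
  // triggers the "Arguments must have same communicators!" error shown above.
  PETScWrappers::SolverCG serial_only_solver (solver_control);

  // Correct: pass the communicator on which the matrix and vectors live.
  PETScWrappers::SolverCG parallel_solver (solver_control, MPI_COMM_WORLD);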
PETScWrappers::SolverBase::SolverBase (SolverControl &cn, const MPI_Comm &mpi_communicator)
Constructor. Takes the solver control object and the MPI communicator over which parallel computations are to happen.
Note that the communicator used here must match the communicator used in the system matrix, solution, and right-hand side objects of the linear system to be solved with this solver. Otherwise, PETSc will generate hard-to-track-down errors; see the documentation of the SolverBase class.
virtual PETScWrappers::SolverBase::~SolverBase () [virtual]
Destructor.
void PETScWrappers::SolverBase::solve (const MatrixBase &A, VectorBase &x, const VectorBase &b, const PreconditionerBase &preconditioner)
Solve the linear system Ax=b. Depending on the information provided by derived classes and the object passed as a preconditioner, one of PETSc's linear solvers and preconditioners is chosen.
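A short usage sketch follows. It assumes an already assembled matrix and vectors that all share one MPI communicator; the function and variable names are illustrative, and the header paths follow recent deal.II versions.

  #include <deal.II/lac/solver_control.h>
  #include <deal.II/lac/petsc_solver.h>
  #include <deal.II/lac/petsc_precondition.h>

  using namespace dealii;

  void solve_system (const PETScWrappers::MatrixBase &system_matrix,
                     PETScWrappers::VectorBase       &solution,
                     const PETScWrappers::VectorBase &system_rhs,
                     const MPI_Comm                   mpi_communicator)
  {
    SolverControl solver_control (1000, 1e-10 * system_rhs.l2_norm ());

    // The derived class (here CG) determines which PETSc solver is used;
    // the preconditioner object determines the PETSc preconditioner.
    PETScWrappers::SolverCG                solver (solver_control, mpi_communicator);
    PETScWrappers::PreconditionBlockJacobi preconditioner (system_matrix);

    solver.solve (system_matrix, solution, system_rhs, preconditioner);
  }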
SolverControl & PETScWrappers::SolverBase::control () const
Access to object that controls convergence.
virtual void PETScWrappers::SolverBase::set_solver_type (KSP &ksp) const [protected, pure virtual]
Function that takes a Krylov subspace solver (KSP) context object and sets the type of solver requested by the derived class.
Implemented in PETScWrappers::SolverRichardson, PETScWrappers::SolverChebychev, PETScWrappers::SolverCG, PETScWrappers::SolverBiCG, PETScWrappers::SolverGMRES, PETScWrappers::SolverBicgstab, PETScWrappers::SolverCGS, PETScWrappers::SolverTFQMR, PETScWrappers::SolverTCQMR, PETScWrappers::SolverCR, PETScWrappers::SolverLSQR, and PETScWrappers::SolverPreOnly.
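The following hypothetical derived class (MySolverCG is not part of deal.II) sketches what an implementation of set_solver_type() typically looks like: it only tells the given KSP context which Krylov method PETSc should use.

  #include <deal.II/lac/petsc_solver.h>
  #include <petscksp.h>

  using namespace dealii;

  class MySolverCG : public PETScWrappers::SolverBase
  {
  public:
    MySolverCG (SolverControl  &cn,
                const MPI_Comm &mpi_communicator = PETSC_COMM_SELF)
      : PETScWrappers::SolverBase (cn, mpi_communicator)
    {}

  protected:
    virtual void set_solver_type (KSP &ksp) const
    {
      // KSPCG selects PETSc's conjugate gradient method; any other KSPType
      // (KSPGMRES, KSPBCGS, ...) could be chosen here instead.
      KSPSetType (ksp, KSPCG);
    }
  };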
static int PETScWrappers::SolverBase::convergence_test (KSP ksp, const int iteration, const PetscScalar residual_norm, KSPConvergedReason *reason, void *solver_control) [static, private]
A function that is used by PETSc as a callback to check convergence. It takes the information provided by PETSc and checks it against deal.II's own SolverControl object to see whether convergence has been reached.
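A hedged sketch of the kind of logic such a callback performs (an illustration using the documented signature, not the actual deal.II implementation): translate the SolverControl decision into a KSPConvergedReason for PETSc.

  #include <deal.II/lac/solver_control.h>
  #include <petscksp.h>

  using namespace dealii;

  int my_convergence_test (KSP /*ksp*/,
                           const int          iteration,
                           const PetscScalar  residual_norm,
                           KSPConvergedReason *reason,
                           void               *solver_control)
  {
    SolverControl &control = *reinterpret_cast<SolverControl *> (solver_control);

    switch (control.check (iteration, residual_norm))
      {
        case SolverControl::iterate:
          *reason = KSP_CONVERGED_ITERATING;  // not converged yet, keep iterating
          break;
        case SolverControl::success:
          *reason = KSP_CONVERGED_RTOL;       // any positive reason stops the solver
          break;
        case SolverControl::failure:
          *reason = KSP_DIVERGED_ITS;         // e.g. maximum number of iterations reached
          break;
        default:
          break;
      }
    return 0;
  }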
SolverControl & PETScWrappers::SolverBase::solver_control [protected]
Reference to the object that controls convergence of the iterative solver. In fact, for these PETSc wrappers, PETSc does so itself, but we copy the data from this object before starting the solution process, and copy the data back into it afterwards.
const MPI_Comm PETScWrappers::SolverBase::mpi_communicator [protected]
Copy of the MPI communicator object to be used for the solver.
std_cxx1x::shared_ptr< SolverData > PETScWrappers::SolverBase::solver_data [private]
Pointer to an object that stores the solver context. This is recreated in the main solver routine if necessary.