PETSc namespace
PETSc integration module for Rodin.
The PETSc module provides integration with the Portable, Extensible Toolkit for Scientific Computation (PETSc), enabling the use of distributed and parallel linear algebra objects, Krylov solvers, and nonlinear solver frameworks within Rodin's finite element pipeline.
Submodules
- Math — PETSc vector, matrix, and linear system wrappers
- Solver — KSP, SNES, CG, and GMRES solver wrappers
- Assembly — Assembly strategies for PETSc objects (sequential, MPI, OpenMP)
- Variational — Trial/test functions, forms, grid functions, and problems backed by PETSc
Key Components
Math types (Rodin::PETSc::Math):
- LinearSystem — Couples a PETSc Mat (operator), right-hand side Vec, and solution Vec. Supports Dirichlet elimination via eliminate() and block-preconditioner field splits via setFieldSplits().
- Vector (::Vec) and Matrix (::Mat) — PETSc handle aliases.

Solvers (Rodin::PETSc::Solver):
- KSP — Wraps the PETSc KSP (Krylov subspace) context. Configurable via setType(), setTolerances(), setPreconditioner(), and PETSc command-line options (-ksp_type, -ksp_rtol, -ksp_monitor, etc.).
- CG — CG specialization setting KSPCG. For symmetric positive-definite systems.
- GMRES — GMRES specialization setting KSPGMRES. For non-symmetric/indefinite systems.
- SNES — Wraps the PETSc SNES (Scalable Nonlinear Equations Solvers) context for Newton-type methods. The inner linear solve uses KSP.

Variational (Rodin::PETSc::Variational):
- TrialFunction / TestFunction — Thin wrappers that trigger PETSc-specific CTAD so that Problem, BilinearForm, and LinearForm automatically use PETSc storage.
- GridFunction — Stores DOF coefficients in a PETSc Vec. In MPI mode, the vector is created with VecMPISetGhost for transparent ghost-layer synchronization.
- Problem — Assembles into a PETSc LinearSystem. Supports both single-field (Problem(u, v)) and multi-field block systems (Problem(u, p, l, v, q, m)) with field-split preconditioning.
- BilinearForm / LinearForm — Assemble into a PETSc Mat / Vec.

Assembly (Rodin::PETSc::Assembly):
- Sequential — Single-rank assembly into PETSc objects.
- MPI — Distributed assembly where each rank contributes its owned elements.
- OpenMP — Shared-memory parallel assembly on a single rank.
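As a minimal sketch, the KSP wrapper can be tuned programmatically through the setters listed above. Only the setter names come from this page; the argument forms shown here are assumptions.

```cpp
#include <Rodin/PETSc.h>
using namespace Rodin;

// Assumes `poisson` is a PETSc-backed Problem, set up as in the
// usage examples below. Setter signatures are assumptions; only
// the names (setType, setTolerances) are documented here.
PETSc::Solver::KSP ksp(poisson);
ksp.setType(KSPGMRES);      // equivalent to passing -ksp_type gmres
ksp.setTolerances(1e-10);   // relative tolerance; argument layout is an assumption
ksp.solve();
```

The same settings can instead be supplied at run time through PETSc command-line options (-ksp_type, -ksp_rtol), which override nothing in the source.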
Typical Usage (Sequential)
```cpp
#include <Rodin/PETSc.h>

PetscInitialize(&argc, &argv, PETSC_NULLPTR, PETSC_NULLPTR);

Mesh mesh;
mesh = mesh.UniformGrid(Polytope::Type::Triangle, {16, 16});
mesh.getConnectivity().compute(1, 2);

P1 Vh(mesh);
PETSc::Variational::TrialFunction u(Vh);
PETSc::Variational::TestFunction v(Vh);

Problem poisson(u, v);
poisson = Integral(Grad(u), Grad(v))
        - Integral(f, v)
        + DirichletBC(u, Zero());

Solver::CG(poisson).solve();

PetscFinalize();
```
Typical Usage (MPI + PETSc)
```cpp
#include <Rodin/MPI.h>
#include <Rodin/PETSc.h>

boost::mpi::environment env(argc, argv);
boost::mpi::communicator world;
Context::MPI mpi(env, world);
PetscInitialize(&argc, &argv, PETSC_NULLPTR, PETSC_NULLPTR);

// Partition and distribute mesh
MPI::Sharder sharder(mpi);
// ... (see MPI module documentation)
auto mesh = sharder.gather(0);
mesh.getConnectivity().compute(2, 3);
mesh.reconcile(2);

P1 Vh(mesh);
PETSc::Variational::TrialFunction u(Vh);
PETSc::Variational::TestFunction v(Vh);

Problem poisson(u, v);
poisson = Integral(Grad(u), Grad(v))
        - Integral(f, v)
        + DirichletBC(u, Zero());

Solver::CG(poisson).solve();

PetscFinalize();
```
Multi-Field Problems (Stokes)
PETSc problems support an arbitrary number of coupled trial/test function pairs for saddle-point systems:
```cpp
H1 uh(std::integral_constant<size_t, 2>{}, mesh, mesh.getSpaceDimension());
H1 ph(std::integral_constant<size_t, 1>{}, mesh);
P0g lh(mesh);

PETSc::Variational::TrialFunction u(uh);
PETSc::Variational::TrialFunction p(ph);
PETSc::Variational::TrialFunction l(lh);
PETSc::Variational::TestFunction  v(uh);
PETSc::Variational::TestFunction  q(ph);
PETSc::Variational::TestFunction  m(lh);

Problem stokes(u, p, l, v, q, m);
stokes = Integral(Jacobian(u), Jacobian(v))
       - Integral(p, Div(v))
       + Integral(Div(u), q)
       + Integral(l, q)
       + Integral(p, m)
       - Integral(f, v)
       + DirichletBC(u, u_exact);

Solver::KSP(stokes).solve();
```
PETSc command-line options can be used to configure field-split preconditioning:

```shell
./Stokes -ksp_type gmres -pc_type fieldsplit -ksp_rtol 1e-8
```
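For saddle-point systems such as Stokes, PETSc's stock Schur-complement fieldsplit options apply as well; the flags below are standard PETSc options rather than anything Rodin-specific:

```shell
# Schur-complement field-split preconditioning with flexible GMRES
./Stokes -ksp_type fgmres \
         -pc_type fieldsplit \
         -pc_fieldsplit_type schur \
         -pc_fieldsplit_schur_fact_type full \
         -ksp_rtol 1e-8 -ksp_monitor
```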
Namespaces
- namespace Assembly
- PETSc-specific assembly strategies.
Typedefs
- using Scalar = PetscScalar
- Scalar type used by PETSc vectors and matrices.
- using Integer = PetscInt
- Integer index type used by PETSc row/column indices.
- using Real = PetscReal
- Real-valued floating-point type (tolerances, norms, etc.).
- using Complex = PetscComplex
- Complex scalar type (available when PETSc is built with complex support).
Typedef documentation
using Rodin::PETSc::Scalar = PetscScalar
#include <Rodin/PETSc/Types.h>
Scalar type used by PETSc vectors and matrices.
using Rodin::PETSc::Integer = PetscInt
#include <Rodin/PETSc/Types.h>
Integer index type used by PETSc row/column indices.
using Rodin::PETSc::Real = PetscReal
#include <Rodin/PETSc/Types.h>
Real-valued floating-point type (tolerances, norms, etc.).
using Rodin::PETSc::Complex = PetscComplex
#include <Rodin/PETSc/Types.h>
Complex scalar type (available when PETSc is built with complex support).
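As an illustration (not from this page), writing loops against these aliases keeps code portable across PETSc configurations, e.g. builds with --with-64-bit-indices or complex scalars. The helper below is hypothetical:

```cpp
#include <Rodin/PETSc/Types.h>
#include <petscmat.h>

using Rodin::PETSc::Integer;
using Rodin::PETSc::Scalar;

// Hypothetical helper: adds `value` to the first `n` diagonal entries of A.
// Because Integer and Scalar track PETSc's configured widths, this compiles
// unchanged whether PETSc uses 32- or 64-bit indices, real or complex scalars.
void addDiagonal(Mat A, Integer n, Scalar value)
{
  for (Integer i = 0; i < n; ++i)
    MatSetValue(A, i, i, value, ADD_VALUES);
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
}
```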