PETSc: Parallel Solvers and Distributed Linear Algebra

Guide to using PETSc-backed variational forms, solvers, and distributed assembly.

Introduction

Rodin's PETSc module integrates the Portable, Extensible Toolkit for Scientific Computation (PETSc) into the finite element pipeline. It provides:

  • PETSc-backed trial/test functions, grid functions, bilinear forms, and linear forms that store data in PETSc Vec and Mat objects.
  • KSP (Krylov subspace) and SNES (nonlinear) solver wrappers.
  • Sequential, OpenMP, and MPI assembly strategies targeting PETSc objects.
  • Seamless support for both Context::Local (single-process) and Context::MPI (distributed) mesh contexts.

Include the module via: #include <Rodin/PETSc.h>

PETSc Variational Components

The PETSc variational wrappers live in the Rodin::PETSc::Variational namespace. They are thin wrappers around the core Rodin classes whose class template argument deduction (CTAD) guides select the PETSc-backed specializations.

Trial and Test Functions

Use PETSc::Variational::TrialFunction and PETSc::Variational::TestFunction instead of the standard versions to activate PETSc storage:

P1 Vh(mesh);

PETSc::Variational::TrialFunction u(Vh);
PETSc::Variational::TestFunction  v(Vh);

The deduction guide automatically associates the trial function with a PETSc::Variational::GridFunction as its solution type, ensuring that the discrete solution is stored in a PETSc Vec.
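For example (a small illustration; assignment to the solution uses the projection semantics described in the next section):

// The deduced solution type is PETSc::Variational::GridFunction, so its
// DOF coefficients live in a PETSc Vec.
auto& uh = u.getSolution();
uh = Zero();  // writes directly into the underlying Vec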

Grid Functions

PETSc::Variational::GridFunction stores DOF coefficients in a PETSc Vec. It supports:

Feature            Description
Projection         gf = expr assigns DOF values from a function expression
Arithmetic         +=, -=, *=, /= for scalars and grid functions
Point evaluation   gf(point) evaluates $u_h$ at a geometric point
DOF access         gf[i] accesses individual DOF coefficients
Data import        setData(Vec) copies from another PETSc vector
I/O                save() / load() for persistence
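Taken together these operations compose naturally. A minimal sketch (assuming the usual space-taking constructor; exact signatures may differ between Rodin releases):

PETSc::Variational::GridFunction gf(Vh);
gf = RealFunction(1);   // projection: assign the constant 1 to every DOF
gf *= 2.0;              // scalar arithmetic on the underlying Vec
gf[0] += 1.0;           // direct DOF access
gf.save("gf.gf");       // persistence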

In MPI mode, the vector is created with VecMPISetGhost for transparent ghost-layer synchronization. The acquire() / flush() / release() methods encapsulate ghost update patterns:

Direction       PETSc call                                                 Purpose
owner → ghost   VecGhostUpdateBegin/End(INSERT_VALUES, SCATTER_FORWARD)    Read remote DOFs before evaluation
ghost → owner   VecGhostUpdateBegin/End(INSERT_VALUES, SCATTER_REVERSE)    Scatter modified ghosts back to owners
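Underneath, these directions correspond to PETSc's standard ghost-update calls on the wrapped vector (a sketch with the raw Vec handle named vec):

// owner → ghost: pull owner values into local ghost entries before reading
VecGhostUpdateBegin(vec, INSERT_VALUES, SCATTER_FORWARD);
VecGhostUpdateEnd(vec, INSERT_VALUES, SCATTER_FORWARD);

// ghost → owner: push locally modified ghost entries back to their owners
VecGhostUpdateBegin(vec, INSERT_VALUES, SCATTER_REVERSE);
VecGhostUpdateEnd(vec, INSERT_VALUES, SCATTER_REVERSE);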

Bilinear and Linear Forms

When PETSc trial/test functions are used, BilinearForm assembles into a PETSc Mat and LinearForm assembles into a PETSc Vec:

// Automatically selected via CTAD
BilinearForm a(u, v);  // assembles into ::Mat
a = Integral(Grad(u), Grad(v));

LinearForm L(v);       // assembles into ::Vec
L = Integral(f, v);

The assembly strategy is automatically selected based on the mesh context:

  • Context::Local → Sequential or OpenMP assembly
  • Context::MPI → MPI-parallel assembly (each rank assembles owned elements)

Problem

The PETSc Problem assembles into a PETSc::Math::LinearSystem containing the system matrix $A$, right-hand side $\mathbf{b}$, and solution $\mathbf{x}$:

Problem poisson(u, v);
poisson = Integral(Grad(u), Grad(v))
        - Integral(f, v)
        + DirichletBC(u, Zero());

// Assembly + solve
Solver::CG(poisson).solve();

// The solution is automatically scattered to u.getSolution()

PETSc Solvers

KSP: Krylov Subspace Solvers

KSP is the base PETSc linear solver. It wraps the PETSc KSP context and supports programmatic configuration as well as command-line overrides.

// Direct usage
Solver::KSP(problem).solve();

// Configured usage
Solver::KSP ksp(problem);
ksp.setType(KSPGMRES);
ksp.setTolerances(
  1e-10,    // rtol:   relative decrease in residual norm
  1e-14,    // abstol: absolute residual norm threshold
  1e5,      // dtol:   divergence tolerance
  1000      // maxIt:  maximum iterations
);
ksp.setPreconditioner(precondMat);
ksp.solve();

PETSc command-line options override programmatic settings: ./myprogram -ksp_type cg -pc_type jacobi -ksp_rtol 1e-8 -ksp_monitor
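The override works because PETSc consults the options database after programmatic configuration; in raw PETSc terms the pattern is equivalent to (a sketch, assuming the wrapper calls KSPSetFromOptions internally, which the override behavior implies):

KSPSetType(ksp, KSPGMRES);   // programmatic default
KSPSetFromOptions(ksp);      // a -ksp_type given on the command line wins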

CG (Conjugate Gradient)

CG is a convenience specialization that sets the KSP type to KSPCG. Use it for symmetric positive-definite (SPD) systems:

Solver::CG(poisson).solve();

GMRES

GMRES sets the KSP type to KSPGMRES. Use it for non-symmetric or indefinite systems:

Solver::GMRES(stokes).solve();

SNES: Nonlinear Solvers

SNES wraps PETSc's nonlinear solver framework. It uses a KSP solver for the inner linear system at each Newton step:

Solver::KSP ksp(problem);
Solver::SNES snes(ksp);
snes.setType(SNESNEWTONLS);
snes.setTolerances(
  1e-8,     // abstol
  1e-10,    // rtol
  1e-6,     // stol (step norm)
  50,       // maxIt (nonlinear iterations)
  10000     // maxF (function evaluations)
);
snes.solve(x);

Linear System

PETSc::Math::LinearSystem couples the PETSc matrix, RHS vector, and solution vector:

auto& axb = problem.getLinearSystem();
auto& A = axb.getOperator();   // ::Mat
auto& b = axb.getVector();     // ::Vec (RHS)
auto& x = axb.getSolution();   // ::Vec (solution)
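Since the comments above indicate these accessors expose the raw ::Mat and ::Vec handles, they can be handed directly to PETSc routines, e.g. for inspection (a sketch, assuming implicit conversion to the raw handles):

MatView(A, PETSC_VIEWER_STDOUT_WORLD);  // dump the assembled operator
VecView(b, PETSC_VIEWER_STDOUT_WORLD);  // dump the right-hand side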

Dirichlet Elimination

Dirichlet boundary conditions are enforced by the eliminate() template method, which zeros the constrained rows of $A$ and sets the corresponding entries of $\mathbf{b}$ and $\mathbf{x}$ to the prescribed boundary values $g_i$:

\[ A_{ii} \leftarrow 1, \quad A_{ij} \leftarrow 0 \;\; (j \ne i), \quad b_i \leftarrow g_i, \quad x_i \leftarrow g_i \]

This is handled automatically by DirichletBC(u, g) during assembly.
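In raw PETSc terms, this elimination pattern corresponds to MatZeroRows, which zeros the given rows, writes the diagonal value, and uses the entries already stored in $\mathbf{x}$ to fix $\mathbf{b}$ (a sketch; rows and numConstrained are hypothetical names for the constrained DOF indices and their count):

// After x[i] has been set to g_i for each constrained DOF i:
MatZeroRows(A, numConstrained, rows, 1.0, x, b);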

Field-Split Preconditioning

For multi-field problems (e.g. Stokes), the linear system supports named field splits that can be passed to PETSc's PCFIELDSPLIT preconditioner:

// The Problem class automatically sets up field splits for multi-field
// problems. Configure via command line:
//   -pc_type fieldsplit
//   -pc_fieldsplit_type schur
//   -fieldsplit_velocity_ksp_type cg
//   -fieldsplit_pressure_ksp_type preonly
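When configuring programmatically instead, the splits map onto the standard PCFIELDSPLIT API. A sketch of what the registration looks like underneath, given a raw KSP handle ksp (the split names and the index sets velocityIS / pressureIS are illustrative):

PC pc;
KSPGetPC(ksp, &pc);
PCSetType(pc, PCFIELDSPLIT);
PCFieldSplitSetIS(pc, "velocity", velocityIS);
PCFieldSplitSetIS(pc, "pressure", pressureIS);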

Assembly Strategies

The PETSc module provides three assembly strategies:

Strategy     Header                        Context          Description
Sequential   PETSc/Assembly/Sequential.h   Context::Local   Single-threaded assembly into PETSc objects
OpenMP       PETSc/Assembly/OpenMP.h       Context::Local   Shared-memory parallel assembly
MPI          PETSc/Assembly/MPI.h          Context::MPI     Distributed assembly (each rank contributes owned elements)

The assembly strategy is selected automatically via the Assembly::Default<TrialContext, TestContext> trait based on the mesh context. Users do not normally need to select it explicitly.

Multi-Field Problems

PETSc problems support an arbitrary number of coupled trial/test function pairs. This is essential for saddle-point systems like Stokes, which involve velocity, pressure, and (optionally) a Lagrange multiplier for the mean-pressure constraint.

// FE spaces: H1 of order 2 for velocity, H1 of order 1 for pressure,
// P0g for the Lagrange multiplier
// The first argument to H1 is a compile-time order using std::integral_constant
H1  uh(std::integral_constant<size_t, 2>{}, mesh, mesh.getSpaceDimension());
H1  ph(std::integral_constant<size_t, 1>{}, mesh);
P0g lh(mesh);

PETSc::Variational::TrialFunction u(uh); u.setName("u");
PETSc::Variational::TrialFunction p(ph); p.setName("p");
PETSc::Variational::TrialFunction l(lh); l.setName("lambda");

PETSc::Variational::TestFunction v(uh);
PETSc::Variational::TestFunction q(ph);
PETSc::Variational::TestFunction m(lh);

Problem stokes(u, p, l, v, q, m);
stokes = Integral(Jacobian(u), Jacobian(v))
       - Integral(p, Div(v))
       + Integral(Div(u), q)
       + Integral(l, q)
       + Integral(p, m)
       + 1e-12 * Integral(p, q)
       + 1e-12 * Integral(l, m)
       - Integral(f, v)
       + DirichletBC(u, u_exact);

stokes.assemble();
Solver::KSP(stokes).solve();
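If the field-split names are taken from setName() as the example suggests, the per-field options would use those names as prefixes, e.g. (illustrative, not a confirmed naming scheme): ./stokes -pc_type fieldsplit -pc_fieldsplit_type schur -fieldsplit_u_ksp_type cg -fieldsplit_p_ksp_type preonly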

Sequential PETSc Example

This example solves a 3D Poisson equation on hexahedra using PETSc CG:

#include <Rodin/PETSc.h>
#include <Rodin/Geometry.h>
#include <Rodin/Variational.h>

using namespace Rodin;
using namespace Rodin::Geometry;
using namespace Rodin::Variational;

int main(int argc, char** argv)
{
  PetscInitialize(&argc, &argv, PETSC_NULLPTR, PETSC_NULLPTR);

  Mesh mesh;
  mesh = mesh.UniformGrid(Polytope::Type::Hexahedron, {16, 16, 16});
  mesh.scale(1.0 / 15);
  mesh.getConnectivity().compute(2, 3);  // face → cell incidence
  mesh.getConnectivity().compute(3, 2);  // cell → face incidence

  RealFunction f = 1;
  P1 Vh(mesh);

  PETSc::Variational::TrialFunction u(Vh);
  PETSc::Variational::TestFunction  v(Vh);

  Problem poisson(u, v);
  poisson = Integral(Grad(u), Grad(v))
          - Integral(f, v)
          + DirichletBC(u, Zero());

  Solver::CG(poisson).solve();

  u.getSolution().save("Poisson.gf", IO::FileFormat::MFEM);
  mesh.save("Poisson.mesh", IO::FileFormat::MFEM);

  PetscFinalize();
}

MPI + PETSc Example

See the MPI guide for the full distributed workflow. The key differences from sequential usage are as follows (a condensed skeleton follows the list):

  1. Create a Context::MPI and initialize PETSc.
  2. Partition and distribute the mesh using Sharder.
  3. Reconcile shared entities.
  4. Use PETSc::Variational::TrialFunction / TestFunction as usual.
  5. The assembly and solve are automatically distributed.
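The skeleton below compresses these steps. The mesh-distribution calls in steps 2-3 are deliberately left as comments because the concrete Sharder API is documented in the MPI guide; `mesh` stands in for the distributed mesh they produce:

#include <Rodin/PETSc.h>
#include <Rodin/Geometry.h>
#include <Rodin/Variational.h>

using namespace Rodin;
using namespace Rodin::Geometry;
using namespace Rodin::Variational;

int main(int argc, char** argv)
{
  PetscInitialize(&argc, &argv, PETSC_NULLPTR, PETSC_NULLPTR);  // step 1

  // Steps 2-3: build a Context::MPI mesh, partition and distribute it with
  // Sharder, then reconcile shared entities (see the MPI guide); the
  // resulting distributed mesh is referred to as `mesh` below.

  // Steps 4-5: identical to the sequential case; assembly and solve are
  // distributed automatically because the mesh context is Context::MPI.
  RealFunction f = 1;
  P1 Vh(mesh);

  PETSc::Variational::TrialFunction u(Vh);
  PETSc::Variational::TestFunction  v(Vh);

  Problem poisson(u, v);
  poisson = Integral(Grad(u), Grad(v))
          - Integral(f, v)
          + DirichletBC(u, Zero());

  Solver::CG(poisson).solve();

  PetscFinalize();
}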

Command-Line Configuration

All PETSc solvers support command-line configuration via PETSc's options database. This is especially powerful for experimenting with preconditioners and solver parameters without recompiling:

Option                Description                               Example
-ksp_type             Krylov method                             -ksp_type gmres
-pc_type              Preconditioner                            -pc_type gamg
-ksp_rtol             Relative tolerance                        -ksp_rtol 1e-10
-ksp_monitor          Print the residual norm each iteration
-ksp_view             Print the solver configuration
-snes_type            Nonlinear method                          -snes_type newtonls
-snes_monitor         Print the nonlinear residual
-pc_type fieldsplit   Block preconditioner
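A typical exploratory invocation combines several of these options (program name illustrative): ./myprogram -ksp_type gmres -pc_type gamg -ksp_rtol 1e-10 -ksp_monitor -ksp_view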

See Also