petsc_direct_solver.py

LinearSolver that uses PETSc for LU factor/solve.

class openmdao.solvers.linear.petsc_direct_solver.PETScDirectSolver(**kwargs)[source]

Bases: DirectSolver

LinearSolver that uses PETSc for LU factor/solve.

Parameters:
**kwargs : dict

Options dictionary.

Attributes:
_PETSc : <petsc4py.PETSc>

A lazily imported petsc4py.PETSc module.

Methods

add_recorder(recorder)

Add a recorder to the solver's RecordingManager.

can_solve_cycle()

Return True if this solver can solve groups with cycles.

check_config(logger)

Perform optional error checks.

cleanup()

Clean up resources prior to exit.

does_recursive_applies()

Return False.

get_outputs_dir(*subdirs[, mkdir])

Get the path under which all output files of this solver are to be placed.

get_reports_dir()

Get the path to the directory where the report files should go.

preferred_sparse_format()

Return the preferred sparse format for the dr/do matrix of a split jacobian.

raise_petsc_error(e, system, matrix)

Raise an error based on the issue PETSc encountered during linearization.

record_iteration(**kwargs)

Record an iteration of the current Solver.

report_failure(msg)

Report a failure that has occurred.

solve(mode[, rel_systems])

Run the solver.

use_relevance()

Return True if relevance should be active.

SOLVER = 'LN: PETScDirect'
__init__(**kwargs)[source]

Declare the solver options.

check_config(logger)[source]

Perform optional error checks.

Parameters:
logger : object

The object that manages logging output.

preferred_sparse_format()[source]

Return the preferred sparse format for the dr/do matrix of a split jacobian.

Returns:
str

The preferred sparse format for the dr/do matrix of a split jacobian.

raise_petsc_error(e, system, matrix)[source]

Raise an error based on the issue PETSc encountered during linearization.

Parameters:
e : Error

Error returned by PETSc.

system : <System>

System containing the DirectSolver.

matrix : ndarray

Matrix of interest.

solve(mode, rel_systems=None)[source]

Run the solver.

Parameters:
mode : str

‘fwd’ or ‘rev’.

rel_systems : set of str

Names of systems relevant to the current solve. Deprecated.
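The 'fwd'/'rev' mode distinction can be illustrated with plain NumPy: a direct solver factors the system matrix once, then forward mode solves A @ x = b while reverse mode solves the transposed system A.T @ x = b. This is a minimal sketch of that semantics only, not the OpenMDAO API itself.

```python
import numpy as np

# Illustration of 'fwd'/'rev' mode semantics for a direct linear solve:
# 'fwd' solves A @ x = b, 'rev' solves the transposed system A.T @ x = b,
# both reusing the same matrix (and, in a real solver, the same factorization).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
b = np.array([1.0, 2.0])

x_fwd = np.linalg.solve(A, b)    # forward mode: A @ x = b
x_rev = np.linalg.solve(A.T, b)  # reverse mode: A.T @ x = b

assert np.allclose(A @ x_fwd, b)
assert np.allclose(A.T @ x_rev, b)
```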

class openmdao.solvers.linear.petsc_direct_solver.PETScLU(A: spmatrix, sparse_solver_name: str = None, comm=None)[source]

Bases: object

Wrapper for PETSc LU decomposition, using petsc4py.

Parameters:
A : ndarray or <scipy.sparse.csc_matrix>

Matrix to use in solving x @ A == b.

sparse_solver_name : str

Name of the direct solver from PETSc to use.

comm : <mpi4py.MPI.Intracomm>

The system MPI communicator.

Attributes:
orig_A : ndarray or <scipy.sparse.csc_matrix>

Originally provided matrix.

A : <petsc4py.PETSc.Mat>

Assembled PETSc AIJ (compressed sparse row format) matrix.

ksp : <petsc4py.PETSc.KSP>

PETSc Krylov Subspace Solver context.

running_mpi : bool

Is the script currently being run under MPI (True) or not (False).

comm : <mpi4py.MPI.Intracomm>

The system MPI communicator.

_x : <petsc4py.PETSc.Vec>

Sequential (non-distributed) PETSc vector to store the solve solution.

_PETSc : <petsc4py.PETSc>

Lazily imported petsc4py.PETSc module.

Methods

solve(b[, transpose])

Solve the linear system using only the preconditioner direct solver.

__init__(A: spmatrix, sparse_solver_name: str = None, comm=None)[source]

Initialize and setup the PETSc LU Direct Solver object.

solve(b: ndarray, transpose: bool = False) → ndarray[source]

Solve the linear system using only the preconditioner direct solver.

Parameters:
b : ndarray

Input data for the right hand side.

transpose : bool

Is A.T @ x == b being solved (True) or A @ x == b (False).

Returns:
ndarray

The solution array.
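The factor-once, solve-many pattern that PETScLU wraps can be sketched with SciPy's SuperLU interface as a stand-in, since petsc4py may not be installed; this is an analogue of the API shape, not the PETSc-backed implementation. `splu` factors the matrix once, and `solve` then handles either A @ x = b or, with `trans='T'`, the transposed system, mirroring the `transpose` flag above.

```python
import numpy as np
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import splu

# Sketch of the factor-once / solve-many pattern using SciPy's SuperLU
# as a stand-in for the PETSc-backed LU wrapper.
A = csc_matrix(np.array([[4.0, 1.0],
                         [2.0, 3.0]]))
lu = splu(A)                 # LU factorization, done once

b = np.array([1.0, 2.0])
x = lu.solve(b)              # A @ x = b    (transpose=False analogue)
xt = lu.solve(b, trans='T')  # A.T @ x = b  (transpose=True analogue)

assert np.allclose(A @ x, b)
assert np.allclose(A.T @ xt, b)
```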

openmdao.solvers.linear.petsc_direct_solver.check_err_on_singular(name, value)[source]

Check the value of the “err_on_singular” option on the PETScDirectSolver.

Parameters:
name : str

Name of the option being checked.

value : bool

Value of the option being checked.
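Option checkers of this form receive the option name and its proposed value and raise if the value is invalid. The sketch below shows the (name, value) shape with a hypothetical bool-only constraint; the function name and the actual constraint enforced by check_err_on_singular are illustrative assumptions, not OpenMDAO's real logic.

```python
# Hypothetical validator sketch; `check_bool_option` and its constraint are
# illustrative, mirroring only the (name, value) checker signature.
def check_bool_option(name, value):
    """Raise ValueError unless `value` is a plain bool."""
    if not isinstance(value, bool):
        raise ValueError(f"Option '{name}' must be True or False, got {value!r}.")

check_bool_option("err_on_singular", True)  # valid value: no exception raised
```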