pymooDriver#

pymooDriver wraps the optimizer package pymoo, which provides a comprehensive suite of evolutionary and swarm-based single- and multi-objective optimization algorithms. Unlike gradient-based drivers, pymooDriver requires no derivatives from the model.

In this example, we use the DE (Differential Evolution) optimizer to minimize the objective of the Paraboloid problem.

import openmdao.api as om
from openmdao.test_suite.components.paraboloid import Paraboloid

prob = om.Problem()
model = prob.model

model.add_subsystem('comp', Paraboloid(), promotes=['*'])

prob.driver = om.pymooDriver()
prob.driver.options['optimizer'] = 'DE'
prob.driver.options['disp'] = False
prob.driver.run_settings['seed'] = 11
prob.driver.run_settings['termination'] = ('n_gen', 100)

model.add_design_var('x', lower=-50.0, upper=50.0)
model.add_design_var('y', lower=-50.0, upper=50.0)
model.add_objective('f_xy')

prob.setup()

prob.set_val('x', 50.0)
prob.set_val('y', 50.0)

prob.run_driver()
Problem: problem
Driver:  pymooDriver
  success     : True
  iterations  : 10002
  runtime     : 2.4228E+00 s
  model_evals : 10002
  model_time  : 4.3827E-01 s
  deriv_evals : 0
  deriv_time  : 0.0000E+00 s
  exit_status : SUCCESS
print(prob.get_val('x'))
print(prob.get_val('y'))
[6.66666667]
[-7.33333335]

The optimizer option selects which pymoo algorithm to use. pymooDriver supports the following optimizers (case sensitive):

Single-Objective:

  • 'GA' - Genetic Algorithm

  • 'DE' - Differential Evolution

  • 'CMAES' - Covariance Matrix Adaptation Evolution Strategy

  • 'NelderMead' - Nelder-Mead simplex algorithm

  • 'MixedVariableGA' - Genetic Algorithm with support for discrete (integer) and mixed integer design variables

  • and others — see the pymoo documentation for the full list

Multi-Objective:

  • 'NSGA2' - Non-dominated Sorting Genetic Algorithm II

  • 'NSGA3' - Non-dominated Sorting Genetic Algorithm III

  • 'MOEAD' - Multi-Objective Evolutionary Algorithm Based on Decomposition

  • 'CTAEA' - Constrained Two-Archive Evolutionary Algorithm

  • and others — see the pymoo documentation for the full list

Not all algorithms support constraints. Algorithms that explicitly assert no constraints (e.g. 'DNSGA2', 'KGB') will raise an error if constraints are added; see the pymoo algorithm table for per-algorithm constraint support. Also note that some algorithms available in pymoo are not listed on the "algorithms" page of the pymoo documentation.

pymooDriver Options#

| Option | Default | Acceptable Values | Acceptable Types | Description |
| --- | --- | --- | --- | --- |
| debug_print | [] | ['desvars', 'nl_cons', 'ln_cons', 'objs', 'totals'] | ['list'] | List of what type of Driver variables to print at each iteration. |
| disp | True | N/A | ['int', 'bool'] | Controls optimizer output verbosity. Not used if "verbose" is manually set in "self.run_settings". |
| invalid_desvar_behavior | warn | ['warn', 'raise', 'ignore'] | N/A | Behavior of driver if the initial value of a design variable exceeds its bounds. The default value may be set using the `OPENMDAO_INVALID_DESVAR_BEHAVIOR` environment variable to one of the valid options. |
| optimizer | GA | ['CTAEA', 'NelderMead', 'PatternSearch', 'RVEA', 'SPEA2', 'RNSGA2', 'NSGA3', 'SRES', 'EPPSO', 'ISRES', 'G3PCX', 'CMOPSO', 'NicheGA', 'KGB', 'DNSGA2', 'RNSGA3', 'MOPSO_CD', 'PINSGA2', 'MixedVariableGA', 'DE', 'UNSGA3', 'NRBO', 'RandomSearch', 'ES', 'Optuna', 'BRKGA', 'SMSEMOA', 'AGEMOEA', 'NSGA2', 'MOEAD', 'GA', 'DIRECT', 'CMAES', 'AGEMOEA2', 'PSO'] | N/A | Name of optimizer to use. |
| procs_per_model | 1 | N/A | N/A | Number of processors to give each model under MPI. |

pymooDriver Constructor#

The call signature for the pymooDriver constructor is:

pymooDriver.__init__(**kwargs)

Initialize the pymooDriver.

Parameters:

**kwargs : dict of keyword arguments
    Keyword arguments that will be mapped into the Driver options.

Using pymooDriver#

pymooDriver has a small set of unified options that can be set as keyword arguments when it is instantiated or through the options dictionary. We have already shown how to set the optimizer option. The disp option controls whether pymooDriver prints iteration-level output and a completion message when the optimization finishes.

import openmdao.api as om
from openmdao.test_suite.components.paraboloid import Paraboloid

prob = om.Problem()
model = prob.model

model.add_subsystem('comp', Paraboloid(), promotes=['*'])

prob.driver = om.pymooDriver()
prob.driver.options['optimizer'] = 'DE'
prob.driver.options['disp'] = True
prob.driver.run_settings['seed'] = 11
# The optimization will not converge in only 5 generations, but we purposely keep
# the number of generations small to demonstrate the printouts when 'disp' is True
# without flooding the page.
prob.driver.run_settings['termination'] = ('n_gen', 5)

model.add_design_var('x', lower=-50.0, upper=50.0)
model.add_design_var('y', lower=-50.0, upper=50.0)
model.add_objective('f_xy')

prob.setup()
prob.run_driver()
=================================================
n_gen  |  n_eval  |     f_avg     |     f_min    
=================================================
     1 |      100 |  1.800362E+03 |  2.6639276142
     2 |      200 |  9.551352E+02 | -1.397073E+01
     3 |      300 |  5.629835E+02 | -1.885303E+01
     4 |      400 |  3.909718E+02 | -2.490437E+01
     5 |      500 |  2.936201E+02 | -2.490437E+01
Optimization Complete
-----------------------------------
Problem: problem2
Driver:  pymooDriver
  success     : True
  iterations  : 502
  runtime     : 1.2100E-01 s
  model_evals : 502
  model_time  : 2.2579E-02 s
  deriv_evals : 0
  deriv_time  : 0.0000E+00 s
  exit_status : SUCCESS

Algorithm Settings#

Each pymoo algorithm has its own set of hyperparameters that control the structure of the algorithm, such as the population size and the crossover and mutation operators. These are passed to the algorithm constructor via the alg_settings dictionary. See the page for each algorithm in the pymoo documentation for its available settings.

Here, we set the population size for the GA using the alg_settings:

import openmdao.api as om
from openmdao.test_suite.components.paraboloid import Paraboloid

prob = om.Problem()
model = prob.model

model.add_subsystem('comp', Paraboloid(), promotes=['*'])

prob.driver = om.pymooDriver()
prob.driver.options['optimizer'] = 'GA'
prob.driver.options['disp'] = False
prob.driver.alg_settings['pop_size'] = 50
prob.driver.run_settings['seed'] = 11
prob.driver.run_settings['termination'] = ('n_gen', 400)

model.add_design_var('x', lower=-50.0, upper=50.0)
model.add_design_var('y', lower=-50.0, upper=50.0)
model.add_objective('f_xy')

prob.setup()
prob.run_driver()

print(prob.get_val('x'))
print(prob.get_val('y'))
[6.66507362]
[-7.33255311]

Run Settings#

Run-level settings that control how the optimization executes, such as the random seed, verbosity, termination criterion, and callback, are passed via the run_settings dictionary. These are forwarded to pymoo.optimize.minimize() and subsequently to algorithm.setup().

Common run settings include:

  • 'seed' — integer random seed for reproducibility

  • 'verbose' — whether to print iteration-level progress (overrides disp option if set)

  • 'termination' — a pymoo termination object or shorthand tuple such as ('n_gen', 200)

  • 'save_history' — whether to store per-generation history in the results object

  • 'callback' — a pymoo callback object called after each generation

See the pymoo documentation for the minimize function for most of the available settings. Note that depending on which optimizer you choose, some additional run settings keys may be available that are not shown there; pymoo does not explicitly document which algorithms accept which run settings. Here, we set a seed for reproducibility and a generation-based termination criterion in the run_settings:

import openmdao.api as om
from openmdao.test_suite.components.paraboloid import Paraboloid

prob = om.Problem()
model = prob.model

model.add_subsystem('comp', Paraboloid(), promotes=['*'])

prob.driver = om.pymooDriver()
prob.driver.options['optimizer'] = 'DE'
prob.driver.options['disp'] = False
prob.driver.run_settings['seed'] = 42
prob.driver.run_settings['termination'] = ('n_gen', 400)
prob.driver.run_settings['verbose'] = False

model.add_design_var('x', lower=-50.0, upper=50.0)
model.add_design_var('y', lower=-50.0, upper=50.0)
model.add_objective('f_xy')

prob.setup()
prob.run_driver()

print(prob.get_val('x'))
print(prob.get_val('y'))
[6.66666665]
[-7.33333331]

Constrained Optimization#

Constraints are supported by most pymoo algorithms (see the optimizer list above). Inequality and equality constraints defined via add_constraint() are automatically translated to pymoo's convention and passed to the algorithm. A solution is considered successful if a feasible solution is found in the final population. Note that by default, pymoo hard-codes a small tolerance (1e-4) for equality constraints into its constraint violation calculation.
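pymoo expects inequality constraints in the form g(x) <= 0, with feasible points giving non-positive values. The translation can be sketched with a hypothetical helper (illustrative only, not the driver's actual code): a constraint with a lower and/or upper bound maps to one or two pymoo-style inequality values.

```python
def to_pymoo_ineq(value, lower=None, upper=None):
    """Translate an OpenMDAO-style bound constraint on `value` into
    pymoo's g(x) <= 0 convention (illustrative sketch only)."""
    g = []
    if lower is not None:
        g.append(lower - value)   # feasible (g <= 0) when value >= lower
    if upper is not None:
        g.append(value - upper)   # feasible (g <= 0) when value <= upper
    return g

# add_constraint('g', lower=15.0) with x + y = 16 is feasible:
print(to_pymoo_ineq(16.0, lower=15.0))  # [-1.0]
```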

Here, we minimize the Paraboloid subject to an inequality constraint x + y >= 15, which forces the optimum away from the unconstrained minimum:

import openmdao.api as om
from openmdao.test_suite.components.paraboloid import Paraboloid

prob = om.Problem()
model = prob.model

model.add_subsystem('comp', Paraboloid(), promotes=['*'])
model.add_subsystem('con', om.ExecComp('g = x + y'), promotes=['*'])

prob.driver = om.pymooDriver()
prob.driver.options['optimizer'] = 'GA'
prob.driver.options['disp'] = False
prob.driver.run_settings['seed'] = 11
prob.driver.run_settings['termination'] = ('n_gen', 400)

model.add_design_var('x', lower=-50.0, upper=50.0)
model.add_design_var('y', lower=-50.0, upper=50.0)
model.add_objective('f_xy')
model.add_constraint('g', lower=15.0)

model.set_input_defaults('y', val=0.0)
model.set_input_defaults('x', val=0.0)

prob.setup()
prob.run_driver()

print(prob.get_val('x'))
print(prob.get_val('y'))
print('x + y =', prob.get_val('x') + prob.get_val('y'))
[14.62004193]
[0.37995807]
x + y = [15.]

Integer and Mixed Integer Optimization#

Discrete (integer) design variables are supported via the MixedVariableGA optimizer. This is the only pymoo algorithm that uses pymoo’s mixed-variable-aware sampling and mating operators, which are required to correctly search integer and mixed integer/continuous spaces. Using any other optimizer with discrete design variables will raise an error.

In this example, we minimize a simple function of one continuous and one integer design variable. The optimal solution is at x = 2.5, n = 3:

import openmdao.api as om


class MixedComp(om.ExplicitComponent):
    """Minimize f = (x - 2.5)^2 + (n - 3)^2 with x continuous and n integer."""

    def setup(self):
        self.add_input('x', val=0.0)
        self.add_discrete_input('n', val=0)
        self.add_output('f', val=0.0)

    def setup_partials(self):
        self.declare_partials('f', 'x')

    def compute(self, inputs, outputs, discrete_inputs, discrete_outputs):
        outputs['f'] = (inputs['x'] - 2.5)**2 + (discrete_inputs['n'] - 3)**2

    def compute_partials(self, inputs, partials, discrete_inputs):
        partials['f', 'x'] = 2.0 * (inputs['x'] - 2.5)


prob = om.Problem()
prob.model.add_subsystem('comp', MixedComp())

prob.driver = om.pymooDriver()
prob.driver.options['optimizer'] = 'MixedVariableGA'
prob.driver.options['disp'] = False
prob.driver.run_settings['seed'] = 11
prob.driver.run_settings['termination'] = ('n_gen', 200)

prob.model.add_design_var('comp.x', lower=-10.0, upper=10.0)
prob.model.add_design_var('comp.n', lower=0, upper=10)
prob.model.add_objective('comp.f')

prob.setup()
prob.run_driver()

print('x =', prob.get_val('comp.x'))
print('n =', prob.get_val('comp.n'))
x = [2.49964893]
n = 3

Multi-Objective Optimization#

pymooDriver supports multi-objective optimization through pymoo’s Pareto-based algorithms such as NSGA2. When a multi-objective optimizer is used, the model is left at the state of the last function evaluation after run_driver() completes. The full Pareto front is stored on the driver in driver.pareto, which contains:

  • driver.pareto['X'] — dict mapping each design variable name to its values across all Pareto solutions

  • driver.pareto['F'] — dict mapping each objective name to its values across all Pareto solutions

  • driver.pareto['X_raw'] — raw design variable array, shape (n_solutions, n_vars), for use with pymoo utilities

  • driver.pareto['F_raw'] — raw objective array, shape (n_solutions, n_objs), for use with pymoo utilities
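Once these arrays are available, ordinary numpy operations apply. This sketch uses a synthetic stand-in for driver.pareto['F_raw'], shaped (n_solutions, n_objs), to select individual solutions from the front:

```python
import numpy as np

# Synthetic stand-in for driver.pareto['F_raw']: 5 Pareto points, 2 objectives.
F_raw = np.array([[0.00, 4.00],
                  [0.25, 2.25],
                  [1.00, 1.00],
                  [2.25, 0.25],
                  [4.00, 0.00]])

# Index of the Pareto point with the best (smallest) first objective.
best_f1 = np.argmin(F_raw[:, 0])

# A simple scalarization: the point minimizing the sum of both objectives.
balanced = np.argmin(F_raw.sum(axis=1))

print(best_f1, balanced)  # 0 2
```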

In this example, we minimize two competing objectives over a single design variable:

import numpy as np
import openmdao.api as om

prob = om.Problem()
model = prob.model

# Two competing objectives: f1 = x^2, f2 = (x - 2)^2
# Pareto front spans x in [0, 2]
exec_comp = model.add_subsystem('exec', om.ExecComp(['f1 = x**2', 'f2 = (x - 2.0)**2'],
                                                    x=0.0, f1=0.0, f2=0.0))

prob.driver = om.pymooDriver()
prob.driver.options['optimizer'] = 'NSGA2'
prob.driver.options['disp'] = False
prob.driver.run_settings['seed'] = 42
prob.driver.run_settings['termination'] = ('n_gen', 200)

model.add_design_var('exec.x', lower=0.0, upper=3.0)
model.add_objective('exec.f1')
model.add_objective('exec.f2')

prob.setup()
prob.run_driver()
Problem: problem7
Driver:  pymooDriver
  success     : True
  iterations  : 20001
  runtime     : 5.7311E+00 s
  model_evals : 20001
  model_time  : 8.7155E-01 s
  deriv_evals : 0
  deriv_time  : 0.0000E+00 s
  exit_status : SUCCESS
from pymoo.visualization.scatter import Scatter

plot = Scatter()
plot.add(prob.driver.pareto['F_raw'], facecolor="none", edgecolor="red")
plot.show()
(The scatter plot shows the computed Pareto front of f1 vs. f2.)

The raw pymoo result object is also available at prob.driver.pymoo_results for any algorithm, so users can access additional information such as execution time, algorithm state, and population history (if save_history=True was set in run_settings).

Parallel Evaluation with MPI#

pymooDriver automatically enables population-level parallelism when more than one MPI rank is available. The population is distributed round-robin across ranks so that multiple individuals are evaluated simultaneously.
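The round-robin pattern can be sketched in plain Python: rank r evaluates the population members whose index i satisfies i % n_ranks == r. This is an illustration of the distribution pattern, not the driver's actual code.

```python
def round_robin(pop_size, n_ranks):
    """Return, for each rank, the list of population indices it evaluates."""
    return [list(range(rank, pop_size, n_ranks)) for rank in range(n_ranks)]

# A population of 10 individuals distributed over 4 ranks:
for rank, indices in enumerate(round_robin(10, 4)):
    print(f"rank {rank}: {indices}")
```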

Note

MPI parallelism requires an MPI installation and must be launched with mpirun or equivalent.

Basic parallel usage#

mpirun -n 4 python my_optimization.py
prob.driver = om.pymooDriver()
prob.driver.options['optimizer'] = 'GA'

With 4 ranks this evaluates 4 individuals simultaneously, giving roughly a 4x speedup in function evaluations for computationally expensive models.

Models with parallel components#

If the model itself contains parallel components (e.g. ParallelGroup), set the procs_per_model option to the number of MPI ranks each model evaluation requires. The total number of ranks must be evenly divisible by procs_per_model.

prob.driver = om.pymooDriver()
prob.driver.options['optimizer'] = 'GA'
prob.driver.options['procs_per_model'] = 2  # each model uses 2 ranks

With 8 total ranks and procs_per_model=2, there are 4 groups of 2 ranks each. Each group cooperates on one model evaluation, so 4 individuals in the pymoo population are evaluated simultaneously.

mpirun -n 8 python my_optimization.py
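The grouping arithmetic can be checked with a small sketch (a hypothetical helper illustrating the divisibility requirement, not the driver's internal code):

```python
def model_groups(total_ranks, procs_per_model):
    """Number of model instances evaluated concurrently under MPI."""
    if total_ranks % procs_per_model != 0:
        raise ValueError("total ranks must be evenly divisible by procs_per_model")
    return total_ranks // procs_per_model

print(model_groups(8, 2))  # 4 concurrent model evaluations
```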