# ScipyOptimizeDriver
ScipyOptimizeDriver wraps the optimizers in scipy.optimize.minimize. In this example, we use the SLSQP optimizer to find the minimum of the Paraboloid problem.
The `Paraboloid` class is defined as:

```python
class Paraboloid(om.ExplicitComponent):
    """
    Evaluates the equation f(x,y) = (x-3)^2 + xy + (y+4)^2 - 3.
    """

    def setup(self):
        self.add_input('x', val=0.0)
        self.add_input('y', val=0.0)

        self.add_output('f_xy', val=0.0)

    def setup_partials(self):
        self.declare_partials('*', '*')

    def compute(self, inputs, outputs):
        """
        f(x,y) = (x-3)^2 + xy + (y+4)^2 - 3

        Optimal solution (minimum): x = 6.6667; y = -7.3333
        """
        x = inputs['x']
        y = inputs['y']

        outputs['f_xy'] = (x-3.0)**2 + x*y + (y+4.0)**2 - 3.0

    def compute_partials(self, inputs, partials):
        """
        Jacobian for our paraboloid.
        """
        x = inputs['x']
        y = inputs['y']

        partials['f_xy', 'x'] = 2.0*x - 6.0 + y
        partials['f_xy', 'y'] = 2.0*y + 8.0 + x
```
```python
import openmdao.api as om
from openmdao.test_suite.components.paraboloid import Paraboloid

prob = om.Problem()
model = prob.model

model.add_subsystem('comp', Paraboloid(), promotes=['*'])

prob.driver = om.ScipyOptimizeDriver()
prob.driver.options['optimizer'] = 'SLSQP'
prob.driver.options['tol'] = 1e-9
prob.driver.options['disp'] = True

model.add_design_var('x', lower=-50.0, upper=50.0)
model.add_design_var('y', lower=-50.0, upper=50.0)
model.add_objective('f_xy')

prob.setup()

prob.set_val('x', 50.0)
prob.set_val('y', 50.0)

prob.run_driver()
```

```
Optimization terminated successfully    (Exit mode 0)
            Current function value: -27.33333333333333
            Iterations: 3
            Function evaluations: 4
            Gradient evaluations: 3
Optimization Complete
-----------------------------------
Problem: problem
Driver:  ScipyOptimizeDriver
  success     : True
  iterations  : 5
  runtime     : 3.3725E-03 s
  model_evals : 5
  model_time  : 3.7742E-04 s
  deriv_evals : 3
  deriv_time  : 1.0325E-03 s
  exit_status : SUCCESS
```

```python
print(prob.get_val('x'))
```

```
[6.66666667]
```

```python
print(prob.get_val('y'))
```

```
[-7.33333333]
```
## ScipyOptimizeDriver Options
| Option | Default | Acceptable Values | Acceptable Types | Description |
| --- | --- | --- | --- | --- |
| debug_print | [] | ['desvars', 'nl_cons', 'ln_cons', 'objs', 'totals'] | ['list'] | List of what type of Driver variables to print at each iteration. |
| disp | True | [True, False] | ['bool'] | Set to False to prevent printing of Scipy convergence messages. |
| invalid_desvar_behavior | warn | ['warn', 'raise', 'ignore'] | N/A | Behavior of driver if the initial value of a design variable exceeds its bounds. The default value may be set using the `OPENMDAO_INVALID_DESVAR_BEHAVIOR` environment variable to one of the valid options. |
| maxiter | 200 | N/A | N/A | Maximum number of iterations. |
| optimizer | SLSQP | ['Newton-CG', 'basinhopping', 'Powell', 'shgo', 'TNC', 'differential_evolution', 'L-BFGS-B', 'BFGS', 'trust-constr', 'dual_annealing', 'SLSQP', 'Nelder-Mead', 'COBYLA', 'CG'] | N/A | Name of optimizer to use. |
| singular_jac_behavior | warn | ['error', 'warn', 'ignore'] | N/A | Defines behavior of a zero row/col check after first call to compute_totals: 'error' - raise an error; 'warn' - raise a warning; 'ignore' - don't perform check. |
| singular_jac_tol | 1e-16 | N/A | N/A | Tolerance for zero row/column check. |
| tol | 1e-06 | N/A | N/A | Tolerance for termination. For detailed control, use solver-specific options. |
## ScipyOptimizeDriver Constructor
The call signature for the ScipyOptimizeDriver constructor is:
- `ScipyOptimizeDriver.__init__(**kwargs)`

  Initialize the ScipyOptimizeDriver.
## ScipyOptimizeDriver Option Examples
### optimizer

The `optimizer` option lets you choose which optimizer to use. ScipyOptimizeDriver supports all of the optimizers in scipy.optimize except for 'dogleg' and 'trust-ncg'. The optimizers you are most likely to use are SLSQP and COBYLA, since they are the only ones that support constraints: SLSQP supports both equality and inequality constraints and uses the gradients provided by OpenMDAO, while COBYLA is gradient-free and supports only inequality constraints.
Here we pass the optimizer option as a keyword argument.
```python
import openmdao.api as om
from openmdao.test_suite.components.paraboloid import Paraboloid

prob = om.Problem()
model = prob.model

model.add_subsystem('comp', Paraboloid(), promotes=['*'])

prob.driver = om.ScipyOptimizeDriver(optimizer='COBYLA')

model.add_design_var('x', lower=-50.0, upper=50.0)
model.add_design_var('y', lower=-50.0, upper=50.0)
model.add_objective('f_xy')

prob.setup()

prob.set_val('x', 50.0)
prob.set_val('y', 50.0)

prob.run_driver()
```

```
Optimization Complete
-----------------------------------

   Normal return from subroutine COBYLA

   NFVALS =  124   F =-2.733333E+01    MAXCV = 0.000000E+00
   X = 6.666667E+00  -7.333332E+00
Problem: problem2
Driver:  ScipyOptimizeDriver
  success     : True
  iterations  : 125
  runtime     : 3.1169E-02 s
  model_evals : 125
  model_time  : 6.5064E-03 s
  deriv_evals : 0
  deriv_time  : 0.0000E+00 s
  exit_status : SUCCESS
```

```python
print(prob.get_val('x'))
```

```
[6.66666669]
```

```python
print(prob.get_val('y'))
```

```
[-7.33333239]
```
### maxiter

The `maxiter` option specifies the maximum number of major iterations before termination. It is generally valid across all of the available optimizers.
```python
import openmdao.api as om
from openmdao.test_suite.components.paraboloid import Paraboloid

prob = om.Problem()
model = prob.model

model.add_subsystem('comp', Paraboloid(), promotes=['*'])

prob.driver = om.ScipyOptimizeDriver()
prob.driver.options['maxiter'] = 20

model.add_design_var('x', lower=-50.0, upper=50.0)
model.add_design_var('y', lower=-50.0, upper=50.0)
model.add_objective('f_xy')

prob.setup()

prob.set_val('x', 50.0)
prob.set_val('y', 50.0)

prob.run_driver()
```

```
Optimization terminated successfully    (Exit mode 0)
            Current function value: -27.33333333333333
            Iterations: 3
            Function evaluations: 4
            Gradient evaluations: 3
Optimization Complete
-----------------------------------
Problem: problem3
Driver:  ScipyOptimizeDriver
  success     : True
  iterations  : 5
  runtime     : 3.4351E-03 s
  model_evals : 5
  model_time  : 3.5107E-04 s
  deriv_evals : 3
  deriv_time  : 8.4150E-04 s
  exit_status : SUCCESS
```

```python
print(prob.get_val('x'))
```

```
[6.66666667]
```

```python
print(prob.get_val('y'))
```

```
[-7.33333333]
```
### tol

The `tol` option lets you specify the tolerance for termination.
```python
import openmdao.api as om
from openmdao.test_suite.components.paraboloid import Paraboloid

prob = om.Problem()
model = prob.model

model.add_subsystem('comp', Paraboloid(), promotes=['*'])

prob.driver = om.ScipyOptimizeDriver()
prob.driver.options['tol'] = 1.0e-9

model.add_design_var('x', lower=-50.0, upper=50.0)
model.add_design_var('y', lower=-50.0, upper=50.0)
model.add_objective('f_xy')

prob.setup()

prob.set_val('x', 50.0)
prob.set_val('y', 50.0)

prob.run_driver()
```

```
Optimization terminated successfully    (Exit mode 0)
            Current function value: -27.33333333333333
            Iterations: 3
            Function evaluations: 4
            Gradient evaluations: 3
Optimization Complete
-----------------------------------
Problem: problem4
Driver:  ScipyOptimizeDriver
  success     : True
  iterations  : 5
  runtime     : 3.2237E-03 s
  model_evals : 5
  model_time  : 3.1978E-04 s
  deriv_evals : 3
  deriv_time  : 8.6615E-04 s
  exit_status : SUCCESS
```

```python
print(prob.get_val('x'))
```

```
[6.66666667]
```

```python
print(prob.get_val('y'))
```

```
[-7.33333333]
```
## ScipyOptimizeDriver Driver Specific Options
Optimizers in scipy.optimize.minimize have optimizer-specific options. To let the user specify values for these options, OpenMDAO provides a dictionary-valued option named `opt_settings`. See the scipy.optimize.minimize documentation for more information about the optimizer-specific options that are available.

As an example, here is code using some `opt_settings` for the shgo optimizer:
```python
# Source of example: https://stefan-endres.github.io/shgo/
import numpy as np

import openmdao.api as om

size = 3  # size of the design variable


def rastrigin(x):
    a = 10  # constant
    return np.sum(np.square(x) - a * np.cos(2 * np.pi * x)) + a * np.size(x)


class Rastrigin(om.ExplicitComponent):

    def setup(self):
        self.add_input('x', np.ones(size))
        self.add_output('f', 0.0)

        self.declare_partials(of='f', wrt='x', method='cs')

    def compute(self, inputs, outputs, discrete_inputs=None, discrete_outputs=None):
        x = inputs['x']
        outputs['f'] = rastrigin(x)


prob = om.Problem()
model = prob.model

model.add_subsystem('rastrigin', Rastrigin(), promotes=['*'])

prob.driver = driver = om.ScipyOptimizeDriver()
driver.options['optimizer'] = 'shgo'
driver.options['disp'] = False
driver.opt_settings['maxtime'] = 10  # seconds
driver.opt_settings['iters'] = 3
driver.opt_settings['maxiter'] = None

model.add_design_var('x', lower=-5.12*np.ones(size), upper=5.12*np.ones(size))
model.add_objective('f')
prob.setup()
prob.set_val('x', 2*np.ones(size))
prob.run_driver()
```

```
/usr/share/miniconda/envs/test/lib/python3.11/site-packages/openmdao/core/total_jac.py:1728: DerivativesWarning:Constraints or objectives [('f', inds=[0])] cannot be impacted by the design variables of the problem.
/usr/share/miniconda/envs/test/lib/python3.11/site-packages/openmdao/core/total_jac.py:1754: DerivativesWarning:Design variables [('x', inds=[0, 1, 2])] have no impact on the constraints or objective.
Problem: problem5
Driver:  ScipyOptimizeDriver
  success     : True
  iterations  : 50
  runtime     : 2.5171E-02 s
  model_evals : 50
  model_time  : 3.4392E-03 s
  deriv_evals : 15
  deriv_time  : 4.6534E-03 s
  exit_status : SUCCESS
```

```python
print(prob.get_val('x'))
```

```
[0. 0. 0.]
```

```python
print(prob.get_val('f'))
```

```
[0.]
```
Notice that when using the shgo optimizer, setting `opt_settings['maxiter']` to None overrides the driver's `options['maxiter']` value. Because `options['maxiter']` only accepts integers, `opt_settings['maxiter']` provides the only way to pass a maxiter value of None to the shgo optimizer.