Note

This feature requires MPI and may not be runnable on Colab.

Adding Constraints

To add a constraint to an optimization, use the add_constraint method on System.

System.add_constraint(name, lower=None, upper=None, equals=None, ref=None, ref0=None, adder=None, scaler=None, units=None, indices=None, linear=False, parallel_deriv_color=None, cache_linear_solution=False, flat_indices=False, alias=None)[source]

Add a constraint variable to this system.

Parameters

name : str

Name of the response variable in the system.

lower : float or ndarray, optional

Lower boundary for the variable.

upper : float or ndarray, optional

Upper boundary for the variable.

equals : float or ndarray, optional

Equality constraint value for the variable.

ref : float or ndarray, optional

Value of response variable that scales to 1.0 in the driver.

ref0 : float or ndarray, optional

Value of response variable that scales to 0.0 in the driver.

adder : float or ndarray, optional

Value to add to the model value to get the scaled value for the driver. adder is first in precedence. adder and scaler are an alternative to using ref and ref0.

scaler : float or ndarray, optional

Value to multiply the model value by to get the scaled value for the driver. scaler is second in precedence. adder and scaler are an alternative to using ref and ref0.

units : str, optional

Units to convert to before applying scaling.

indices : sequence of int, optional

If variable is an array, these indicate which entries are of interest for this particular response. These may be positive or negative integers.

linear : bool

Set to True if constraint is linear. Default is False.

parallel_deriv_color : str

If specified, this constraint will be grouped for parallel derivative calculations with other variables sharing the same parallel_deriv_color.

cache_linear_solution : bool

If True, store the linear solution vectors for this variable so they can be used to start the next linear solution with an initial guess equal to the solution from the previous linear solve.

flat_indices : bool

If True, interpret specified indices as indices into a flat source array.

alias : str

Alias for this response. Necessary when adding multiple constraints on different indices or slices of a single variable.

Notes

The response can be scaled using ref and ref0. The argument ref0 represents the physical value when the scaled value is 0. The argument ref represents the physical value when the scaled value is 1. The arguments (lower, upper, equals) cannot be strings or variable names.
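The ref/ref0 mapping described above is equivalent to an adder/scaler pair. The sketch below illustrates that relationship with a hypothetical helper (plain Python arithmetic, not OpenMDAO internals): ref0 maps to a scaled value of 0.0 and ref maps to 1.0.

```python
def driver_scaled(physical, ref=1.0, ref0=0.0):
    """Map a physical value to the scaled value the driver sees.

    Illustrative only: equivalent adder/scaler form of the ref/ref0 mapping,
    scaled = (physical + adder) * scaler.
    """
    adder = -ref0
    scaler = 1.0 / (ref - ref0)
    return (physical + adder) * scaler

print(driver_scaled(300.0, ref=300.0, ref0=100.0))  # -> 1.0
print(driver_scaled(100.0, ref=300.0, ref0=100.0))  # -> 0.0
```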

Specifying units

You can specify units when adding a constraint. When you do, the constraint value is converted from the target output’s units to the desired units before being passed to the optimizer. If you also specify scaling, that scaling is applied after the unit conversion. The upper and lower limits in the constraint definition must likewise be given in these units.

import openmdao.api as om

prob = om.Problem()
model = prob.model

model.add_subsystem('comp1', om.ExecComp('y1 = 2.0*x',
                                         x={'val': 2.0, 'units': 'degF'},
                                         y1={'val': 2.0, 'units': 'degF'}),
                    promotes=['x', 'y1'])

model.add_subsystem('comp2', om.ExecComp('y2 = 3.0*x',
                                         x={'val': 2.0, 'units': 'degF'},
                                         y2={'val': 2.0, 'units': 'degF'}),
                    promotes=['x', 'y2'])

model.set_input_defaults('x', 35.0, units='degF')

model.add_design_var('x', units='degC', lower=0.0, upper=100.0)
model.add_constraint('y1', units='degC', lower=0.0, upper=100.0)
model.add_objective('y2', units='degC')

prob.setup()
prob.run_driver()
print('Model variables')
print(prob.get_val('x', indices=[0]))
Model variables
[35.]
print(prob.get_val('comp2.y2', indices=[0]))
[105.]
print(prob.get_val('comp1.y1', indices=[0]))
[70.]
print('Driver variables')
dv = prob.driver.get_design_var_values()
print(dv['x'][0])
Driver variables
1.6666666666666983
obj = prob.driver.get_objective_values(driver_scaling=True)
print(obj['comp2.y2'][0])
40.555555555555586
con = prob.driver.get_constraint_values(driver_scaling=True)
print(con['comp1.y1'][0])
21.111111111111143
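The driver values printed above follow directly from the degF-to-degC conversion of the model values (35, 105, and 70 degF). A quick arithmetic check, using a hypothetical helper named degf_to_degc:

```python
def degf_to_degc(f):
    # Standard Fahrenheit-to-Celsius conversion.
    return (f - 32.0) * 5.0 / 9.0

print(degf_to_degc(35.0))   # design var x:  ~1.6667, matching the driver value
print(degf_to_degc(105.0))  # objective y2:  ~40.5556
print(degf_to_degc(70.0))   # constraint y1: ~21.1111
```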

Using the output of a distributed component as a constraint

You can use an output of a distributed component as a constraint or an objective. OpenMDAO automatically collects the values from all processors and provides them to the driver.

Here is an example where we perform an optimization on a model containing a distributed DistParabFeature component. Its output is declared as an inequality constraint.

class DistParabFeature(om.ExplicitComponent):

    def initialize(self):
        self.options.declare('arr_size', types=int, default=10,
                             desc="Size of input and output vectors.")

    def setup(self):

        arr_size = self.options['arr_size']
        self.add_input('x', val=1., distributed=False,
                       shape=arr_size)
        self.add_input('y', val=1., distributed=False,
                       shape=arr_size)

        sizes, offsets = evenly_distrib_idxs(self.comm.size, arr_size)
        self.start = offsets[self.comm.rank]
        self.end = self.start + sizes[self.comm.rank]
        self.a = -3.0 + 0.6 * np.arange(self.start, self.end)

        self.add_output('f_xy', shape=len(self.a), distributed=True)
        self.add_output('f_sum', shape=1, distributed=False)

        self.declare_coloring(wrt='*', method='fd')

    def compute(self, inputs, outputs):
        x = inputs['x'][self.start:self.end]
        y = inputs['y'][self.start:self.end]

        outputs['f_xy'] = (x + self.a)**2 + x * y + (y + 4.0)**2 - 3.0

        local_sum = np.sum(outputs['f_xy'])
        global_sum = np.zeros(1)
        self.comm.Allreduce(local_sum, global_sum, op=MPI.SUM)
        outputs['f_sum'] = global_sum
%%px

import numpy as np
import openmdao.api as om

from openmdao.test_suite.components.paraboloid_distributed import DistParabFeature

size = 7

prob = om.Problem()
model = prob.model

ivc = om.IndepVarComp()
ivc.add_output('x', np.ones(size))
ivc.add_output('y', -1.42 * np.ones(size))

model.add_subsystem('p', ivc, promotes=['*'])
model.add_subsystem("parab", DistParabFeature(arr_size=size), promotes=['*'])

model.add_design_var('x', lower=-50.0, upper=50.0)
model.add_constraint('f_xy', lower=0.0)
model.add_objective('f_sum', index=-1)

prob.driver = om.pyOptSparseDriver(optimizer='SLSQP')
prob.setup()

prob.run_driver()
[stderr:2] /usr/share/miniconda/envs/test/lib/python3.10/site-packages/pyoptsparse/pyOpt_optimization.py:234: UserWarning: The argument `type=` is deprecated. Use `varType` in the future.
  warnings.warn("The argument `type=` is deprecated. Use `varType` in the future.")
/usr/share/miniconda/envs/test/lib/python3.10/site-packages/openmdao/core/system.py:1263: DerivativesWarning:'parab' <class DistParabFeature>: Coloring was deactivated.  Improvement of 0.0% was less than min allowed (5.0%).
Out[3:22]: False
[stdout:0] 

Optimization Problem -- Optimization using pyOpt_sparse
================================================================================
    Objective Function: _objfunc

    Solution: 
--------------------------------------------------------------------------------
    Total Time:                    2.3364
       User Objective Time :       0.1740
       User Sensitivity Time :     2.1543
       Interface Time :            0.0066
       Opt Solver Time:            0.0016
    Calls to Objective Function :       7
    Calls to Sens Function :            7


   Objectives
      Index  Name                   Value
          0  parab.f_sum     1.150150E+01

   Variables (c - continuous, i - integer, d - discrete)
      Index  Name    Type      Lower Bound            Value      Upper Bound     Status
          0  p.x_0      c    -5.000000E+01     2.657527E+00     5.000000E+01           
          1  p.x_1      c    -5.000000E+01     2.604332E+00     5.000000E+01           
          2  p.x_2      c    -5.000000E+01     2.510059E+00     5.000000E+01           
          3  p.x_3      c    -5.000000E+01     1.910212E+00     5.000000E+01           
          4  p.x_4      c    -5.000000E+01     1.310076E+00     5.000000E+01           
          5  p.x_5      c    -5.000000E+01     7.099281E-01     5.000000E+01           
          6  p.x_6      c    -5.000000E+01     1.097805E-01     5.000000E+01           

   Constraints (i - inequality, e - equality)
      Index  Name       Type          Lower           Value           Upper    Status  Lagrange Multiplier (N/A)
          0  parab.f_xy    i   0.000000E+00    7.993606E-15    1.000000E+30         l    9.00000E+100
          1  parab.f_xy    i   0.000000E+00   -1.345590E-13    1.000000E+30         l    9.00000E+100
          2  parab.f_xy    i   0.000000E+00    5.963000E-01    1.000000E+30              9.00000E+100
          3  parab.f_xy    i   0.000000E+00    1.448300E+00    1.000000E+30              9.00000E+100
          4  parab.f_xy    i   0.000000E+00    2.300300E+00    1.000000E+30              9.00000E+100
          5  parab.f_xy    i   0.000000E+00    3.152300E+00    1.000000E+30              9.00000E+100
          6  parab.f_xy    i   0.000000E+00    4.004300E+00    1.000000E+30              9.00000E+100

--------------------------------------------------------------------------------
%%px

desvar = prob.get_val('p.x', get_remote=True)
obj = prob.get_val('f_sum', get_remote=True)

print(desvar)
[stdout:1] [2.65752672 2.60433212 2.51005939 1.91021206 1.31007579 0.70992813
 0.10978047]
%%px

print(obj)
[stdout:0] [11.50150011]

Adding multiple constraints on an array variable

Sometimes you have a variable and would like to constrain different parts of it in different ways. OpenMDAO stores constraint data in a dictionary keyed on the variable pathname, so an additional “alias” argument must be specified when declaring any additional constraints on the same variable.

In the following example, the variable “exec.z” has an equality constraint on the first point and a lower bound on the end point. We differentiate the second constraint from the first by giving it an alias.

import numpy as np
import openmdao.api as om

p = om.Problem()

exec_comp = om.ExecComp(['y = x**2',
                         'z = a + x**2'],
                        a={'shape': (1,)},
                        y={'shape': (101,)},
                        x={'shape': (101,)},
                        z={'shape': (101,)})

p.model.add_subsystem('exec', exec_comp)

p.model.add_design_var('exec.a', lower=-1000, upper=1000)
p.model.add_objective('exec.y', index=50)
p.model.add_constraint('exec.z', indices=[0], equals=25)
p.model.add_constraint('exec.z', indices=[-1], lower=20, alias="End_Constraint")

p.driver = om.pyOptSparseDriver()
p.driver.options['optimizer'] = 'SLSQP'

p.setup()

p.set_val('exec.x', np.linspace(-10, 10, 101))

p.run_driver()

print(p.get_val('exec.z')[0], 25)
print(p.get_val('exec.z')[50], -75)
Optimization Problem -- Optimization using pyOpt_sparse
================================================================================
    Objective Function: _objfunc

    Solution: 
--------------------------------------------------------------------------------
    Total Time:                    0.0056
       User Objective Time :       0.0009
       User Sensitivity Time :     0.0034
       Interface Time :            0.0010
       Opt Solver Time:            0.0004
    Calls to Objective Function :       2
    Calls to Sens Function :            1


   Objectives
      Index  Name              Value
          0  exec.y     0.000000E+00

   Variables (c - continuous, i - integer, d - discrete)
      Index  Name       Type      Lower Bound            Value      Upper Bound     Status
          0  exec.a_0      c    -1.000000E+03    -7.500000E+01     1.000000E+03           

   Constraints (i - inequality, e - equality)
      Index  Name           Type          Lower           Value           Upper    Status  Lagrange Multiplier (N/A)
          0  exec.z            e   2.500000E+01    2.500000E+01    2.500000E+01              9.00000E+100
          1  End_Constraint    i   2.000000E+01    2.500000E+01    1.000000E+30              9.00000E+100

--------------------------------------------------------------------------------
25.0 25
-75.0 -75
/usr/share/miniconda/envs/test/lib/python3.10/site-packages/pyoptsparse/pyOpt_optimization.py:234: UserWarning: The argument `type=` is deprecated. Use `varType` in the future.
  warnings.warn("The argument `type=` is deprecated. Use `varType` in the future.")
/usr/share/miniconda/envs/test/lib/python3.10/site-packages/openmdao/core/total_jac.py:1567: DerivativesWarning:Constraints or objectives [('exec.y', inds=[50])] cannot be impacted by the design variables of the problem.
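The optimum found above can be verified by hand: with x spanning [-10, 10] and z = a + x**2, the equality constraint z[0] = a + 100 = 25 forces a = -75, and the end-point constraint z[-1] = -75 + 100 = 25 >= 20 is then satisfied. A short check of that arithmetic:

```python
import numpy as np

# Hand verification of the constrained optimum: z = a + x**2 on x in [-10, 10].
x = np.linspace(-10, 10, 101)
a = 25.0 - x[0]**2   # equality constraint z[0] = a + 100 = 25  =>  a = -75
z = a + x**2

print(a)      # -75.0
print(z[0])   # 25.0  (equality constraint satisfied)
print(z[-1])  # 25.0  (>= 20, so "End_Constraint" is satisfied)
print(z[50])  # ~ -75.0 (x[50] ~ 0, the objective point)
```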