What is OpenMDAO?
OpenMDAO is an open-source, high-performance computing platform for systems analysis and multidisciplinary optimization, written in Python. It enables you to decompose your models, making them easier to build and maintain, while still solving them in a tightly coupled manner with efficient parallel numerical methods.
Why Use OpenMDAO?
The OpenMDAO project is primarily focused on supporting gradient-based optimization with analytic derivatives to allow you to explore large design spaces with hundreds or thousands of design variables, but the framework also has a number of parallel computing features that can work with gradient-free optimization, mixed-integer nonlinear programming, and traditional design space exploration.
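Analytic derivatives are what make those large design spaces tractable: instead of estimating gradients by repeatedly re-running the model, you supply the partial derivatives in closed form. As a rough illustration (plain Python, not OpenMDAO's component API), here are the analytic partials of the paraboloid used in the example below, checked against a forward finite difference, which is the kind of fallback approximation a framework uses when analytic partials are not provided:

```python
# f(x, y) = (x-3)**2 + x*y + (y+4)**2 - 3  (the function minimized below)

def f(x, y):
    return (x - 3)**2 + x*y + (y + 4)**2 - 3

def grad_analytic(x, y):
    # Closed-form partials: df/dx = 2(x-3) + y,  df/dy = x + 2(y+4)
    return 2*(x - 3) + y, x + 2*(y + 4)

def grad_fd(x, y, h=1e-6):
    # Forward-difference approximation, for comparison
    return ((f(x + h, y) - f(x, y)) / h,
            (f(x, y + h) - f(x, y)) / h)

ga = grad_analytic(3.0, -4.0)
gf = grad_fd(3.0, -4.0)
print(ga)  # exact gradient at the starting point: (-4.0, 3.0)
print(gf)  # finite-difference estimate, accurate only to ~1e-5
```

The analytic version is both exact and costs no extra function evaluations, which is why gradient-based optimization with analytic derivatives scales to hundreds or thousands of design variables.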
Getting Started with OpenMDAO
From your Python environment (we recommend Anaconda), just type:
>> pip install openmdao
Sample Optimization File
Here is a simple example run file to get you started on your first optimization. Copy the following code into a file named paraboloid_min.py:
from openmdao.api import Problem, ScipyOptimizeDriver, ExecComp, IndepVarComp

# build the model
prob = Problem()
indeps = prob.model.add_subsystem('indeps', IndepVarComp())
indeps.add_output('x', 3.0)
indeps.add_output('y', -4.0)

prob.model.add_subsystem('paraboloid',
                         ExecComp('f = (x-3)**2 + x*y + (y+4)**2 - 3'))

prob.model.connect('indeps.x', 'paraboloid.x')
prob.model.connect('indeps.y', 'paraboloid.y')

prob.driver = ScipyOptimizeDriver()
prob.driver.options['optimizer'] = 'SLSQP'

prob.model.add_design_var('indeps.x', lower=-50, upper=50)
prob.model.add_design_var('indeps.y', lower=-50, upper=50)
prob.model.add_objective('paraboloid.f')

prob.setup()
prob.run_driver()

# minimum value
print(prob['paraboloid.f'])

# location of the minimum
print(prob['indeps.x'])
print(prob['indeps.y'])
Then, to run the file, simply type:
>> python paraboloid_min.py
If everything works as planned, the output should look like this:
Optimization terminated successfully.    (Exit mode 0)
            Current function value: -27.333333333333336
            Iterations: 5
            Function evaluations: 6
            Gradient evaluations: 5
Optimization Complete
-----------------------------------
[-27.33333333]
[ 6.66666667]
[-7.33333333]
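You can sanity-check this result without OpenMDAO at all: ScipyOptimizeDriver with 'SLSQP' wraps SciPy's SLSQP optimizer, so minimizing the same function directly with scipy.optimize should land at the same point. A minimal cross-check, assuming SciPy is installed:

```python
from scipy.optimize import minimize

def f(v):
    x, y = v
    return (x - 3)**2 + x*y + (y + 4)**2 - 3

# Same starting point (3, -4) and bounds (-50, 50) as the OpenMDAO model
res = minimize(f, [3.0, -4.0], method='SLSQP',
               bounds=[(-50, 50), (-50, 50)])

print(res.fun)  # minimum value, about -27.3333
print(res.x)    # location of the minimum, about [6.6667, -7.3333]
```

Setting the gradient to zero by hand gives the same answer: 2(x-3) + y = 0 and x + 2(y+4) = 0 solve to x = 20/3 and y = -22/3, with f = -82/3 ≈ -27.3333, matching the output above.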