{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cell-0",
   "metadata": {
    "tags": [
     "remove-input",
     "active-ipynb",
     "remove-output"
    ]
   },
   "outputs": [],
   "source": [
    "try:\n",
    "    from openmdao.utils.notebook_utils import notebook_mode  # noqa: F401\n",
    "except ImportError:\n",
    "    !python -m pip install openmdao[notebooks]"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "cell-1",
   "metadata": {},
   "source": [
    "# pymooDriver\n",
    "\n",
    "pymooDriver wraps the optimizer package pymoo, which provides a comprehensive suite of evolutionary and swarm-based single- and multi-objective optimization algorithms. Unlike gradient-based drivers, pymooDriver requires no derivatives from the model.\n",
    "\n",
    "In this example, we use the DE (Differential Evolution) optimizer to minimize the objective of the Paraboloid problem."
   ]
  },
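  {
   "cell_type": "markdown",
   "id": "evo-loop-sketch-md",
   "metadata": {},
   "source": [
    "Because evolutionary algorithms only ever ask the model for function values, no derivative information is needed. A minimal random-search loop (an illustrative sketch, not pymoo code) shows the propose, evaluate, and select pattern these optimizers share:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "evo-loop-sketch",
   "metadata": {},
   "outputs": [],
   "source": [
    "import random\n",
    "\n",
    "def f(x, y):\n",
    "    # the OpenMDAO test-suite paraboloid; minimum near x = 6.67, y = -7.33\n",
    "    return (x - 3.0)**2 + x * y + (y + 4.0)**2 - 3.0\n",
    "\n",
    "random.seed(11)\n",
    "best = (0.0, 0.0)\n",
    "for _ in range(5000):\n",
    "    # propose a perturbed candidate, then keep it only if it improves f\n",
    "    cand = (best[0] + random.gauss(0, 1), best[1] + random.gauss(0, 1))\n",
    "    if f(*cand) < f(*best):  # selection uses function values only\n",
    "        best = cand\n",
    "print(best, f(*best))"
   ]
  },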
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cell-2",
   "metadata": {
    "tags": [
     "remove-input",
     "remove-output"
    ]
   },
   "outputs": [],
   "source": [
    "from openmdao.utils.notebook_utils import get_code\n",
    "from myst_nb import glue\n",
    "glue('code_pymoo_paraboloid', get_code('openmdao.test_suite.components.paraboloid.Paraboloid'), display=False)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "cell-3",
   "metadata": {},
   "source": [
    ":::{Admonition} `Paraboloid` class definition\n",
    ":class: dropdown\n",
    "\n",
    "{glue:}`code_pymoo_paraboloid`\n",
    ":::"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cell-4",
   "metadata": {},
   "outputs": [],
   "source": [
    "import openmdao.api as om\n",
    "from openmdao.test_suite.components.paraboloid import Paraboloid\n",
    "\n",
    "prob = om.Problem()\n",
    "model = prob.model\n",
    "\n",
    "model.add_subsystem('comp', Paraboloid(), promotes=['*'])\n",
    "\n",
    "prob.driver = om.pymooDriver()\n",
    "prob.driver.options['optimizer'] = 'DE'\n",
    "prob.driver.options['disp'] = False\n",
    "prob.driver.run_settings['seed'] = 11\n",
    "prob.driver.run_settings['termination'] = ('n_gen', 100)\n",
    "\n",
    "model.add_design_var('x', lower=-50.0, upper=50.0)\n",
    "model.add_design_var('y', lower=-50.0, upper=50.0)\n",
    "model.add_objective('f_xy')\n",
    "\n",
    "prob.setup()\n",
    "\n",
    "prob.set_val('x', 50.0)\n",
    "prob.set_val('y', 50.0)\n",
    "\n",
    "prob.run_driver()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cell-5",
   "metadata": {},
   "outputs": [],
   "source": [
    "print(prob.get_val('x'))\n",
    "print(prob.get_val('y'))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cell-6",
   "metadata": {
    "tags": [
     "remove-input",
     "remove-output"
    ]
   },
   "outputs": [],
   "source": [
    "from openmdao.utils.assert_utils import assert_near_equal\n",
    "assert_near_equal(prob.get_val('x'), 6.66666667, 1e-3)\n",
    "assert_near_equal(prob.get_val('y'), -7.3333333, 1e-3)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "cell-7",
   "metadata": {},
   "source": [
    "The `optimizer` option selects which pymoo algorithm to use. pymooDriver supports\n",
    "the following optimizers (case sensitive):\n",
    "\n",
    "**Single-Objective**:\n",
    "- `'GA'` - Genetic Algorithm\n",
    "- `'DE'` - Differential Evolution\n",
    "- `'CMAES'` - Covariance Matrix Adaptation Evolution Strategy\n",
    "- `'NelderMead'` - Nelder-Mead simplex algorithm\n",
    "- `'MixedVariableGA'` - Genetic Algorithm with support for discrete (integer) and mixed integer design variables\n",
    "- and others — see the [pymoo documentation](https://pymoo.org/algorithms/list.html) for the full list\n",
    "\n",
    "**Multi-Objective**:\n",
    "- `'NSGA2'` - Non-dominated Sorting Genetic Algorithm II\n",
    "- `'NSGA3'` - Non-dominated Sorting Genetic Algorithm III\n",
    "- `'MOEAD'` - Multi-Objective Evolutionary Algorithm Based on Decomposition\n",
    "- `'CTAEA'` - Constrained Two-Archive Evolutionary Algorithm\n",
    "- and others — see the [pymoo documentation](https://pymoo.org/algorithms/list.html) for the full list\n",
    "\n",
    "Not all algorithms support constraints. Algorithms that explicitly assert no constraints\n",
    "(e.g. `'DNSGA2'`, `'KGB'`) will raise an error if constraints are added. See the\n",
    "[pymoo algorithm table](https://pymoo.org/algorithms/list.html) for constraint support\n",
    "details per algorithm. Also note that there are some algorithms available in pymoo which are not explicitly documented in the \"algorithms\" page in their documentation."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "cell-8",
   "metadata": {},
   "source": [
    "## pymooDriver Options"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cell-9",
   "metadata": {
    "tags": [
     "remove-input"
    ]
   },
   "outputs": [],
   "source": [
    "om.show_options_table('openmdao.drivers.pymoo_driver.pymooDriver')"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "cell-10",
   "metadata": {},
   "source": [
    "## pymooDriver Constructor\n",
    "\n",
    "The call signature for the *pymooDriver* constructor is:\n",
    "\n",
    "```{eval-rst}\n",
    "    .. automethod:: openmdao.drivers.pymoo_driver.pymooDriver.__init__\n",
    "       :noindex:\n",
    "```\n",
    "\n",
    "## Using pymooDriver\n",
    "\n",
    "pymooDriver has a small number of unified options that can be set as keyword arguments when it is instantiated or through the `options` dictionary. We have already shown how to set the `optimizer` option. The `disp` option controls whether pymooDriver prints a completion message when the optimization finishes and iteration level outputs."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cell-11",
   "metadata": {},
   "outputs": [],
   "source": [
    "import openmdao.api as om\n",
    "from openmdao.test_suite.components.paraboloid import Paraboloid\n",
    "\n",
    "prob = om.Problem()\n",
    "model = prob.model\n",
    "\n",
    "model.add_subsystem('comp', Paraboloid(), promotes=['*'])\n",
    "\n",
    "prob.driver = om.pymooDriver()\n",
    "prob.driver.options['optimizer'] = 'DE'\n",
    "prob.driver.options['disp'] = True\n",
    "prob.driver.run_settings['seed'] = 11\n",
    "# The optimization will fail at only 5 generations, but we purposely make the number\n",
    "# of generations small just to demonstrate the print outs when 'disp' = True\n",
    "# without flooding the page.\n",
    "prob.driver.run_settings['termination'] = ('n_gen', 5)\n",
    "\n",
    "model.add_design_var('x', lower=-50.0, upper=50.0)\n",
    "model.add_design_var('y', lower=-50.0, upper=50.0)\n",
    "model.add_objective('f_xy')\n",
    "\n",
    "prob.setup()\n",
    "prob.run_driver()"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "cell-12",
   "metadata": {},
   "source": [
    "## Algorithm Settings\n",
    "\n",
    "Each pymoo algorithm has its own set of hyperparameters that control the structure of the algorithm — for example, population size, crossover and mutation operators. These are passed to the algorithm constructor via the `alg_settings` dictionary. See the [pymoo documentation](https://pymoo.org) and navigate to the page for an algorithm to see the available settings.\n",
    "\n",
    "Here, we set the population size for the GA using the `alg_settings`:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cell-13",
   "metadata": {},
   "outputs": [],
   "source": [
    "import openmdao.api as om\n",
    "from openmdao.test_suite.components.paraboloid import Paraboloid\n",
    "\n",
    "prob = om.Problem()\n",
    "model = prob.model\n",
    "\n",
    "model.add_subsystem('comp', Paraboloid(), promotes=['*'])\n",
    "\n",
    "prob.driver = om.pymooDriver()\n",
    "prob.driver.options['optimizer'] = 'GA'\n",
    "prob.driver.options['disp'] = False\n",
    "prob.driver.alg_settings['pop_size'] = 50\n",
    "prob.driver.run_settings['seed'] = 11\n",
    "prob.driver.run_settings['termination'] = ('n_gen', 400)\n",
    "\n",
    "model.add_design_var('x', lower=-50.0, upper=50.0)\n",
    "model.add_design_var('y', lower=-50.0, upper=50.0)\n",
    "model.add_objective('f_xy')\n",
    "\n",
    "prob.setup()\n",
    "prob.run_driver()\n",
    "\n",
    "print(prob.get_val('x'))\n",
    "print(prob.get_val('y'))"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "cell-14",
   "metadata": {},
   "source": [
    "## Run Settings\n",
    "\n",
    "Run-level settings that control how the optimization executes — such as the random seed, verbosity, termination criterion, and callback — are passed via the `run_settings` dictionary. These are forwarded to `pymoo.optimize.minimize()` and subsequently to `algorithm.setup()`.\n",
    "\n",
    "Common run settings include:\n",
    "\n",
    "- `'seed'` — integer random seed for reproducibility\n",
    "- `'verbose'` — whether to print iteration-level progress (overrides the `disp` option if set)\n",
    "- `'termination'` — a pymoo termination object or shorthand tuple such as `('n_gen', 200)`\n",
    "- `'save_history'` — whether to store per-generation history in the results object\n",
    "- `'callback'` — a pymoo callback object called after each generation\n",
    "\n",
    "See the pymoo documentation for the [minimize](https://pymoo.org/interface/minimize.html) function for most of the available settings. Note that depending on which optimizer you choose, there may be slightly different run settings keys available which aren't shown in that link. It is not explicitly documented in pymoo which algorithms allow exactly which run settings. Here, we set a seed for reproducibility and a generation-based termination criterion in the `run_settings`:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cell-15",
   "metadata": {},
   "outputs": [],
   "source": [
    "import openmdao.api as om\n",
    "from openmdao.test_suite.components.paraboloid import Paraboloid\n",
    "\n",
    "prob = om.Problem()\n",
    "model = prob.model\n",
    "\n",
    "model.add_subsystem('comp', Paraboloid(), promotes=['*'])\n",
    "\n",
    "prob.driver = om.pymooDriver()\n",
    "prob.driver.options['optimizer'] = 'DE'\n",
    "prob.driver.options['disp'] = False\n",
    "prob.driver.run_settings['seed'] = 42\n",
    "prob.driver.run_settings['termination'] = ('n_gen', 400)\n",
    "prob.driver.run_settings['verbose'] = False\n",
    "\n",
    "model.add_design_var('x', lower=-50.0, upper=50.0)\n",
    "model.add_design_var('y', lower=-50.0, upper=50.0)\n",
    "model.add_objective('f_xy')\n",
    "\n",
    "prob.setup()\n",
    "prob.run_driver()\n",
    "\n",
    "print(prob.get_val('x'))\n",
    "print(prob.get_val('y'))"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "ld1qazcyj4j",
   "metadata": {},
   "source": [
    "## Constrained Optimization\n",
    "\n",
    "Constraints are supported by most pymoo algorithms (see the optimizer list above). Inequality and equality constraints defined via `add_constraint()` are automatically translated to pymoo's convention and passed to the algorithm. A solution is considered successful if there is a feasible solution found in the final population. Note that pymoo hard codes a small tolerance for equality constraints (1e-4) into its constraint violation calculation by default.\n",
    "\n",
    "Here, we minimize the Paraboloid subject to an inequality constraint `x + y >= 15`, which forces the optimum away from the unconstrained minimum:"
   ]
  },
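  {
   "cell_type": "markdown",
   "id": "constraint-convention-md",
   "metadata": {},
   "source": [
    "As a rough illustration of pymoo's convention (this sketch is not the actual pymooDriver source), an inequality constraint is feasible when its translated value `g` satisfies `g <= 0`, and an equality constraint's violation is measured as `max(0, |h| - eps)` with pymoo's default `eps = 1e-4`:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "constraint-convention-sketch",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Hypothetical sketch of translating OpenMDAO-style constraint bounds to\n",
    "# pymoo's g(x) <= 0 convention; not the actual pymooDriver implementation.\n",
    "def to_pymoo_ineq(value, lower=None, upper=None):\n",
    "    \"\"\"Return g-values that are feasible when g <= 0.\"\"\"\n",
    "    g = []\n",
    "    if lower is not None:\n",
    "        g.append(lower - value)  # lower <= value  ->  lower - value <= 0\n",
    "    if upper is not None:\n",
    "        g.append(value - upper)  # value <= upper  ->  value - upper <= 0\n",
    "    return g\n",
    "\n",
    "def eq_violation(value, equals, eps=1e-4):\n",
    "    \"\"\"Equality-constraint violation with pymoo's default tolerance.\"\"\"\n",
    "    return max(0.0, abs(value - equals) - eps)\n",
    "\n",
    "print(to_pymoo_ineq(20.0, lower=15.0))  # [-5.0] -> feasible\n",
    "print(to_pymoo_ineq(10.0, lower=15.0))  # [5.0]  -> violated\n",
    "print(eq_violation(15.00005, 15.0))     # within tolerance -> 0.0"
   ]
  },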
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "vc5on2mfno",
   "metadata": {},
   "outputs": [],
   "source": [
    "import openmdao.api as om\n",
    "from openmdao.test_suite.components.paraboloid import Paraboloid\n",
    "\n",
    "prob = om.Problem()\n",
    "model = prob.model\n",
    "\n",
    "model.add_subsystem('comp', Paraboloid(), promotes=['*'])\n",
    "model.add_subsystem('con', om.ExecComp('g = x + y'), promotes=['*'])\n",
    "\n",
    "prob.driver = om.pymooDriver()\n",
    "prob.driver.options['optimizer'] = 'GA'\n",
    "prob.driver.options['disp'] = False\n",
    "prob.driver.run_settings['seed'] = 11\n",
    "prob.driver.run_settings['termination'] = ('n_gen', 400)\n",
    "\n",
    "model.add_design_var('x', lower=-50.0, upper=50.0)\n",
    "model.add_design_var('y', lower=-50.0, upper=50.0)\n",
    "model.add_objective('f_xy')\n",
    "model.add_constraint('g', lower=15.0)\n",
    "\n",
    "model.set_input_defaults('y', val=0.0)\n",
    "model.set_input_defaults('x', val=0.0)\n",
    "\n",
    "prob.setup()\n",
    "prob.run_driver()\n",
    "\n",
    "print(prob.get_val('x'))\n",
    "print(prob.get_val('y'))\n",
    "print('x + y =', prob.get_val('x') + prob.get_val('y'))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "2zf996g254i",
   "metadata": {
    "tags": [
     "remove-output",
     "remove-input"
    ]
   },
   "outputs": [],
   "source": [
    "from openmdao.utils.assert_utils import assert_near_equal\n",
    "\n",
    "# Constraint x + y >= 15 should be satisfied\n",
    "assert prob.get_val('x') + prob.get_val('y') >= 15.0 - 1e-3"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "dajko1i29w",
   "metadata": {},
   "source": [
    "## Integer and Mixed Integer Optimization\n",
    "\n",
    "Discrete (integer) design variables are supported via the `MixedVariableGA` optimizer. This is the only pymoo algorithm that uses pymoo's mixed-variable-aware sampling and mating operators, which are required to correctly search integer and mixed integer/continuous spaces. Using any other optimizer with discrete design variables will raise an error.\n",
    "\n",
    "In this example, we minimize a simple function of one continuous and one integer design variable. The optimal solution is at `x = 2.5`, `n = 3`:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "1zt4g1o9zgp",
   "metadata": {},
   "outputs": [],
   "source": [
    "import openmdao.api as om\n",
    "\n",
    "\n",
    "class MixedComp(om.ExplicitComponent):\n",
    "    \"\"\"Minimize f = (x - 2.5)^2 + (n - 3)^2 with x continuous and n integer.\"\"\"\n",
    "\n",
    "    def setup(self):\n",
    "        self.add_input('x', val=0.0)\n",
    "        self.add_discrete_input('n', val=0)\n",
    "        self.add_output('f', val=0.0)\n",
    "\n",
    "    def setup_partials(self):\n",
    "        self.declare_partials('f', 'x')\n",
    "\n",
    "    def compute(self, inputs, outputs, discrete_inputs, discrete_outputs):\n",
    "        outputs['f'] = (inputs['x'] - 2.5)**2 + (discrete_inputs['n'] - 3)**2\n",
    "\n",
    "    def compute_partials(self, inputs, partials, discrete_inputs):\n",
    "        partials['f', 'x'] = 2.0 * (inputs['x'] - 2.5)\n",
    "\n",
    "\n",
    "prob = om.Problem()\n",
    "prob.model.add_subsystem('comp', MixedComp())\n",
    "\n",
    "prob.driver = om.pymooDriver()\n",
    "prob.driver.options['optimizer'] = 'MixedVariableGA'\n",
    "prob.driver.options['disp'] = False\n",
    "prob.driver.run_settings['seed'] = 11\n",
    "prob.driver.run_settings['termination'] = ('n_gen', 200)\n",
    "\n",
    "prob.model.add_design_var('comp.x', lower=-10.0, upper=10.0)\n",
    "prob.model.add_design_var('comp.n', lower=0, upper=10)\n",
    "prob.model.add_objective('comp.f')\n",
    "\n",
    "prob.setup()\n",
    "prob.run_driver()\n",
    "\n",
    "print('x =', prob.get_val('comp.x'))\n",
    "print('n =', prob.get_val('comp.n'))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "3vy75yhumra",
   "metadata": {
    "tags": [
     "remove-input",
     "remove-output"
    ]
   },
   "outputs": [],
   "source": [
    "from openmdao.utils.assert_utils import assert_near_equal\n",
    "\n",
    "assert_near_equal(prob.get_val('comp.x'), 2.5, 1e-2)\n",
    "assert prob.get_val('comp.n') == 3"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "cell-16",
   "metadata": {},
   "source": "## Multi-Objective Optimization\n\npymooDriver supports multi-objective optimization through pymoo's Pareto-based algorithms such as NSGA2. When a multi-objective optimizer is used, the model is left at the state of the last function evaluation after `run_driver()` completes. The full Pareto front is stored on the driver in `driver.pareto`, which contains:\n\n- `driver.pareto['X']` — dict mapping each design variable name to its values across all Pareto solutions\n- `driver.pareto['F']` — dict mapping each objective name to its values across all Pareto solutions\n- `driver.pareto['X_raw']` — raw design variable array, shape `(n_solutions, n_vars)`, for use with pymoo utilities\n- `driver.pareto['F_raw']` — raw objective array, shape `(n_solutions, n_objs)`, for use with pymoo utilities\n\nIn this example, we minimize two competing objectives over a single design variable:"
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cell-17",
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "import openmdao.api as om\n",
    "\n",
    "prob = om.Problem()\n",
    "model = prob.model\n",
    "\n",
    "# Two competing objectives: f1 = x^2, f2 = (x - 2)^2\n",
    "# Pareto front spans x in [0, 2]\n",
    "exec_comp = model.add_subsystem('exec', om.ExecComp(['f1 = x**2', 'f2 = (x - 2.0)**2'],\n",
    "                                                    x=0.0, f1=0.0, f2=0.0))\n",
    "\n",
    "prob.driver = om.pymooDriver()\n",
    "prob.driver.options['optimizer'] = 'NSGA2'\n",
    "prob.driver.options['disp'] = False\n",
    "prob.driver.run_settings['seed'] = 42\n",
    "prob.driver.run_settings['termination'] = ('n_gen', 200)\n",
    "\n",
    "model.add_design_var('exec.x', lower=0.0, upper=3.0)\n",
    "model.add_objective('exec.f1')\n",
    "model.add_objective('exec.f2')\n",
    "\n",
    "prob.setup()\n",
    "prob.run_driver()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cell-18",
   "metadata": {},
   "outputs": [],
   "source": "from pymoo.visualization.scatter import Scatter\n\nplot = Scatter()\nplot.add(prob.driver.pareto['F_raw'], facecolor=\"none\", edgecolor=\"red\")\nplot.show()"
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cell-19",
   "metadata": {
    "tags": [
     "remove-input",
     "remove-output"
    ]
   },
   "outputs": [],
   "source": "from openmdao.utils.assert_utils import assert_near_equal\n\nx_pareto = prob.driver.pareto['X']['exec.x']\nf1_pareto = prob.driver.pareto['F']['exec.f1']\nf2_pareto = prob.driver.pareto['F']['exec.f2']\n\nassert x_pareto is not None\nassert f1_pareto is not None\n# Pareto front should span x in [0, 2]\nassert_near_equal(x_pareto.min(), 0.0, 1e-2)\nassert_near_equal(x_pareto.max(), 2.0, 1e-2)"
  },
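  {
   "cell_type": "markdown",
   "id": "pareto-filter-md",
   "metadata": {},
   "source": [
    "The front returned by a multi-objective run contains only non-dominated solutions: no other solution is at least as good in every objective and strictly better in one. A minimal filter over two minimized objectives (an illustrative sketch, unrelated to pymoo's internals) makes the idea concrete:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "pareto-filter-sketch",
   "metadata": {},
   "outputs": [],
   "source": [
    "def pareto_front(points):\n",
    "    \"\"\"Keep points not weakly dominated by another point (minimization).\"\"\"\n",
    "    front = []\n",
    "    for p in points:\n",
    "        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)\n",
    "        if not dominated:\n",
    "            front.append(p)\n",
    "    return front\n",
    "\n",
    "pts = [(0.0, 4.0), (1.0, 1.0), (4.0, 0.0), (2.0, 3.0)]\n",
    "print(pareto_front(pts))  # (2.0, 3.0) is dominated by (1.0, 1.0)"
   ]
  },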
  {
   "cell_type": "markdown",
   "id": "cell-20",
   "metadata": {},
   "source": [
    "The raw pymoo result object is also available at `prob.driver.pymoo_results` for any algorithm so users can access additional information such as execution time, algorithm state, and population history (if `save_history=True` was set in `run_settings`)."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "td11w4tnuef",
   "metadata": {},
   "source": [
    "## Parallel Evaluation with MPI\n",
    "\n",
    "pymooDriver automatically enables population-level parallelism when more than one MPI rank is available. The population is distributed round-robin across ranks so that multiple individuals are evaluated simultaneously.\n",
    "\n",
    "```{note}\n",
    "MPI parallelism requires an MPI installation and must be launched with ``mpirun`` or equivalent.\n",
    "```\n",
    "\n",
    "### Basic parallel usage\n",
    "```bash\n",
    "mpirun -n 4 python my_optimization.py\n",
    "```\n",
    "\n",
    "```python\n",
    "prob.driver = om.pymooDriver()\n",
    "prob.driver.options['optimizer'] = 'GA'\n",
    "```\n",
    "\n",
    "With 4 ranks this evaluates 4 individuals simultaneously, giving roughly a 4x speedup in function evaluations for computationally expensive models.\n",
    "\n",
    "### Models with parallel components\n",
    "\n",
    "If the model itself contains parallel components (e.g. `ParallelGroup`), set the `procs_per_model` option to the number of MPI ranks each model evaluation requires. The total number of ranks must be evenly divisible by `procs_per_model`.\n",
    "\n",
    "```python\n",
    "prob.driver = om.pymooDriver()\n",
    "prob.driver.options['optimizer'] = 'GA'\n",
    "prob.driver.options['procs_per_model'] = 2  # each model uses 2 ranks\n",
    "```\n",
    "\n",
    "With 8 total ranks and `procs_per_model=2`, there are 4 groups of 2 ranks each. Each group cooperates on one model evaluation, so 4 individuals in the pymoo population are evaluated simultaneously.\n",
    "\n",
    "```bash\n",
    "mpirun -n 8 python my_optimization.py\n",
    "```"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "om-dev",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.11.13"
  },
  "orphan": true
 },
 "nbformat": 4,
 "nbformat_minor": 5
}