Package openmdao.main

This package contains the openmdao framework infrastructure code.

assembly.py

Class definition for Assembly

class openmdao.main.assembly.Assembly(doc=None, directory='')[source]

Bases: openmdao.main.component.Component

This is a container of Components. It understands how to connect inputs and outputs between its children. When executed, it runs the top level Driver called ‘driver’.

Instance (Driver) driver

The top level Driver that manages execution of this Assembly

  • copy: ‘deep’
add(name, obj)[source]

Add obj to the component graph and call base class add.

Returns the added object.

check_resolve(pathnames)[source]

Returns True if all of the pathnames are resolvable starting from this Assembly.

connect(srcpath, destpath)[source]

Connect one src Variable to one destination Variable. This could be a normal connection (output to input) or a passthrough connection.

create_passthrough(pathname, alias=None)[source]

Creates a PassthroughTrait that uses the trait indicated by pathname for validation (if it’s not a property trait), adds it to self, and creates a connection between the two. If alias is None, the name of the “promoted” trait will be the last entry in its pathname. The trait specified by pathname must exist.

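As a brief illustration of the methods above, the following sketch wires two components together inside an Assembly and promotes one input to the assembly boundary. MyComp is a hypothetical Component subclass, and the import path for the Float variable class is an assumption rather than something taken from this reference.

    from openmdao.main.assembly import Assembly
    from openmdao.main.component import Component
    from openmdao.main.container import set_as_top
    from openmdao.lib.api import Float   # assumed import path for Float

    class MyComp(Component):
        """Hypothetical component: y = 2 * x."""
        x = Float(0.0, iotype='in')
        y = Float(0.0, iotype='out')

        def execute(self):
            self.y = 2.0 * self.x

    asm = set_as_top(Assembly())
    asm.add('c1', MyComp())
    asm.add('c2', MyComp())
    asm.connect('c1.y', 'c2.x')                # normal output-to-input connection
    asm.create_passthrough('c1.x', alias='x')  # promote c1.x to the boundary
    print(asm.list_connections(show_passthrough=True))
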
disconnect(varpath, varpath2=None)[source]

If varpath2 is supplied, remove the connection between varpath and varpath2. Otherwise, if varpath is the name of a trait, remove all connections to/from varpath in the current scope. If varpath is the name of a Component, remove all connections from all of its inputs and outputs.

exec_counts(compnames)[source]
execute()[source]

Runs driver and updates our boundary variables.

get_dyn_trait(pathname, iotype=None)[source]

Retrieves the named trait, attempting to create a PassthroughTrait on-the-fly if the specified trait doesn’t exist.

get_valids(names)[source]

Returns a list of boolean values indicating whether the named variables are valid (True) or invalid (False). Entries in names may specify either direct traits of self or those of direct children of self, but no deeper in the hierarchy than that.

invalidate_deps(compname=None, varnames=None, notify_parent=False)[source]

Mark all Variables invalid that depend on varnames. Returns a list of our newly invalidated boundary outputs.

is_destination(varpath)[source]

Return True if the Variable specified by varpath is a destination according to our graph. This means that either it’s an input connected to an output, or it’s the destination part of a passthrough connection.

list_connections(show_passthrough=True)[source]

Return a list of tuples of the form (outvarname, invarname).

remove(name)[source]

Remove the named container object from this assembly and remove it from its workflow (if any).

step()[source]

Execute a single child component and return.

stop()[source]

Stop the calculation.

update_inputs(compname, varnames)[source]

Transfer input data to input variables on the specified component. The varnames iterator is assumed to contain local names (no component name), for example: [‘a’, ‘b’].

update_outputs(outnames)[source]

Execute any necessary internal or predecessor components in order to make the specified output variables valid.

case.py

class openmdao.main.case.Case(inputs=None, outputs=None, max_retries=None, retries=None, msg='', ident='')[source]

Bases: object

Contains all information necessary to specify an input case, i.e., a list of name, index, value tuples for all inputs to the case, all outputs collected after running the case, an indicator of the exit status of the case, a string containing error messages associated with the running of the case, and an optional case identifier. The value entry of output tuples should be set to None prior to executing the case.

add_input(name, value, index=None)[source]

Convenience function for adding an input

add_output(name, index=None)[source]

Convenience function for adding an output

apply_inputs(scope)[source]

Set all of the inputs in this case to their specified values in the given scope.

update_outputs(scope, msg=None)[source]

Update the value of all outputs of interest, using the given scope, and/or set error message.

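A short sketch of building and using a Case with the methods above. The variable names and the scope object (assumed here to be an already-configured Assembly) are placeholders.

    from openmdao.main.case import Case

    case = Case(ident='sample-1')
    case.add_input('comp.x', 2.5)
    case.add_input('comp.y', 1.0)
    case.add_output('comp.f_xy')   # value stays None until collected after the run

    case.apply_inputs(scope)       # push the input values into the model
    scope.run()                    # 'scope' is an assumed, already-built Assembly
    case.update_outputs(scope)     # pull the outputs of interest back into the case
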
caseiter.py

openmdao.main.caseiter.caseiter_to_dict(caseiter, varnames, include_errors=False)[source]

Retrieve the values of specified variables from cases in a CaseIterator.

Returns a dict containing a list of values for each entry, keyed on variable name.

Only data from cases containing ALL of the specified variables will be returned so that all data values with the same index will correspond to the same case.

caseiter: CaseIterator
A CaseIterator containing the cases of interest.
varnames: list[str]
iterator of names of variables to be retrieved.
include_errors: bool, optional [False]
If True, include data from cases that reported an error.

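A sketch of pulling recorded data out of a CaseIterator. The recorder object and variable names are placeholders; get_iterator() is documented for ICaseRecorder later in this section.

    from openmdao.main.caseiter import caseiter_to_dict

    data = caseiter_to_dict(recorder.get_iterator(),
                            ['comp.x', 'comp.f_xy'])

    # Each key maps to a list of values; entries with the same index come
    # from the same case, since only cases containing all variables are used.
    for x, f in zip(data['comp.x'], data['comp.f_xy']):
        print('%s -> %s' % (x, f))
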
component.py

Class definition for Component

class openmdao.main.component.Component(doc=None, directory='')[source]

Bases: openmdao.main.container.Container

This is the base class for all objects containing Traits that are accessible to the OpenMDAO framework and are “runnable.”

Str directory

If non-blank, the directory to execute in.

  • iotype: ‘in’
Bool force_execute

If True, always execute even if all IO traits are valid.

  • iotype: ‘in’
List external_files

FileMetadata objects for external files used by this component

  • copy: ‘deep’
add(name, obj)[source]

Override of base class version to force call to check_config after any child containers are added. Returns the added Container object.

add_trait(name, trait)[source]

Overrides base definition of add_trait in order to force call to check_config prior to execution when new traits are added.

check_config()[source]

Verify that this component is fully configured to execute. This function is called once prior to the first execution of this component and may be called explicitly at other times if desired. Classes that override this function must still call the base class version.

check_path(path, check_dir=False)[source]

Verify that the given path is a directory and is located within the allowed area (somewhere within the simulation root path).

checkpoint(outstream, fmt=4)[source]

Save sufficient information for a restart. By default, this just calls save().

config_changed(update_parent=True)[source]

Call this whenever the configuration of this Component changes, for example, children are added or removed.

execute()[source]

Perform calculations or other actions, assuming that inputs have already been set. This must be overridden in derived classes.

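A minimal Component subclass is sketched below; only execute() is overridden, and the framework does the input/output bookkeeping when run() is called. The import path for Float is an assumption, not taken from this reference.

    from openmdao.main.component import Component
    from openmdao.lib.api import Float   # assumed import path for Float

    class Paraboloid(Component):
        """Hypothetical component computing f_xy = (x - 3)**2 + x*y."""
        x = Float(0.0, iotype='in')
        y = Float(0.0, iotype='in')
        f_xy = Float(0.0, iotype='out')

        def execute(self):
            # Inputs have already been set by the framework; just compute outputs.
            self.f_xy = (self.x - 3.0) ** 2 + self.x * self.y

In practice the component is executed through run(), which fetches input variables, executes, and updates output variables, rather than by calling execute() directly.
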
get_abs_directory()[source]

Return absolute path of execution directory.

get_expr_depends()[source]

Returns a list of tuples of the form (src_comp_name, dest_comp_name) for each dependency resulting from ExprEvaluators in this Component.

get_expr_sources()[source]

Return a list of tuples containing the names of all upstream components that are referenced in any of our objectives, along with an initial exec_count of 0.

get_file_vars()[source]

Return a list of (filevarname, filevarvalue, file trait) tuples owned by this component.

get_valid(name)[source]

Get the value of the validity flag for the io trait with the given name.

get_valids(names)[source]

Get a list of validity flags for the io traits with the given names.

invalidate_deps(varnames=None, notify_parent=False)[source]

Invalidate all of our outputs if they’re not invalid already. For a typical Component, this will always be all or nothing, meaning there will never be partial validation of outputs. Components supporting partial output validation must override this function.

Returns None, indicating that all outputs are invalidated.

is_valid()[source]

Return False if any of our variables is invalid.

list_containers()[source]

Return a list of names of child Containers.

list_inputs(valid=None)[source]

Return a list of names of input values. If valid is not None, the list will contain names of inputs with matching validity.

list_outputs(valid=None)[source]

Return a list of names of output values. If valid is not None, the list will contain names of outputs with matching validity.

static load(instream, fmt=4, package=None, call_post_load=True, top_obj=True, name='', observer=None)[source]

Load object(s) from instream. If instream is an installed package name, then any external files referenced in the object(s) are copied from the package installation to appropriate directories. If an external file has metadata attribute ‘constant’ == True and the machine supports it, a symlink is used rather than a file copy. The package and top_obj arguments are normally used by a loader script (generated by save_to_egg()) to load a sub-component from the egg. name is set when creating an instance via a factory. In this case, external files are copied to a name directory and the component’s directory attribute set accordingly. Existing files are not overwritten. Returns the root object.

instream: file or string
Stream to load from.
fmt: int
Format of state data.
package: string
Name of package to look for instream if instream is a string that is not an existing file.
call_post_load: bool
If True, call post_load().
top_obj: bool
Set True if loading the default entry, False if loading a child entry point object.
name: string
Name for the root object
observer: callable
Will be called via an EggObserver.
new_trait(name)[source]
pop_dir()[source]

Return to previous directory saved by push_dir().

push_dir(directory=None)[source]

Change directory to directory, remembering the current directory for a later pop_dir(). Returns the new absolute directory path.

remove(name)[source]

Override of base class version to force call to check_config after any child containers are removed.

remove_trait(name)[source]

Overrides base definition of remove_trait in order to force a call to check_config prior to execution when a trait is removed.

restart(instream)[source]

Restore state using a checkpoint file. The checkpoint file is typically a delta from a full saved state file. If checkpoint is overridden, this should also be overridden.

run(force=False)[source]

Run this object. This should include fetching input variables, executing, and updating output variables. Do not override this function.

save_to_egg(name, version, py_dir=None, require_relpaths=True, child_objs=None, dst_dir=None, fmt=4, proto=-1, use_setuptools=False, observer=None)[source]
Save state and other files to an egg. Typically used to copy all or

part of a simulation to another user or machine. By specifying child components in child_objs, it will be possible to create instances of just those components from the installed egg. Child component names should be specified relative to this component.

name: string
Name for egg, must be an alphanumeric string.
version: string
Version for egg, must be an alphanumeric string.
py_dir: string
The (root) directory for local Python files. It defaults to the current directory.
require_relpaths: bool
If True, any path (directory attribute, external file, or file trait) which cannot be made relative to this component’s directory will raise ValueError. Otherwise such paths generate a warning and the file is skipped.
child_objs: list
List of child objects for additional entry points.
dst_dir: string
The directory to write the egg in.
fmt: int
Passed to eggsaver.save().
proto: int
Passed to eggsaver.save().
use_setuptools: bool
Passed to eggsaver.save_to_egg().
observer: callable
Will be called via an EggObserver.

After collecting files and possibly modifying their paths, this calls container.save_to_egg(). Returns (egg_filename, required_distributions, orphan_modules).

set_valids(names, valid)[source]

Mark the io traits with the given names as valid or invalid.

step()[source]

For Components that run other components (e.g., Assembly or Drivers), this will run one Component and return. For simple components, it is the same as run().

stop()[source]

Stop this component.

tree_rooted()[source]

Calls the base class version of tree_rooted(), checks our directory for validity, and creates the directory if it doesn’t exist.

update_outputs(outnames)[source]

Do what is necessary to make the specified output Variables valid. For a simple Component, this will result in a run().

dir_context[source]

The DirectoryContext for this component.

class openmdao.main.component.SimulationRoot[source]

Bases: object

Singleton object used to hold root directory.

static chroot(path)[source]

Change to directory path and set the singleton’s root. Normally not called but useful in special situations.

path: string
Path to move to.
static get_root()[source]

Return this simulation’s root directory path.

static legal_path(path)[source]

Return True if path is legal (descendant of our root).

path: string
Path to check.

constants.py

Package containing various constants/enumerations.

Currently it only includes: SAVE_YAML, SAVE_LIBYAML, SAVE_PICKLE, SAVE_CPICKLE

container.py

The Container class

class openmdao.main.container.Container(doc=None)[source]

Bases: enthought.traits.has_traits.HasTraits

Base class for all objects having Traits that are visible to the framework

add(name, obj, **kw_args)[source]

Add a Container object to this Container. Returns the added Container object.

add_trait(name, trait)[source]

Overrides HasTraits definition of add_trait in order to keep track of dynamically added traits for serialization.

contains(path)[source]

Return True if the child specified by the given dotted path name is contained in this Container.

get(path, index=None)[source]

Return the object specified by the given path, which may contain ‘.’ characters.

get_dyn_trait(name, iotype=None)[source]

Retrieves the named trait, attempting to create it on-the-fly if it doesn’t already exist.

get_metadata(traitpath, metaname=None)[source]

Retrieve the metadata associated with the trait found using traitpath. If metaname is None, return the entire metadata dictionary for the specified trait. Otherwise, just return the specified piece of metadata. If the specified piece of metadata is not part of the trait, None is returned.

get_pathname(rel_to_scope=None)[source]

Return full path name to this container, relative to scope rel_to_scope. If rel_to_scope is None, return the full pathname.

get_wrapped_attr(name)[source]

If the named trait can return a TraitValMetaWrapper, then this function will return that, with the value set to the current value of the named variable. Otherwise, it functions like getattr, just returning the value of the named variable. Raises an exception if the named trait cannot be found. The value will be copied if the trait has a ‘copy’ metadata attribute that is not None. Possible values for ‘copy’ are ‘shallow’ and ‘deep’.

invoke(path, *args, **kwargs)[source]

Call the callable specified by path, which may be a simple name or a dotted path, passing the given arguments to it, and return the result.

items(recurse=False, **metadata)[source]

Return a list of tuples of the form (rel_pathname, obj) for each trait of this Container that matches the given metadata. If recurse is True, also iterate through all child Containers of each Container found.

keys(recurse=False, **metadata)[source]

Return a list of the relative pathnames of children of this Container that match the given metadata. If recurse is True, child Containers will also be iterated over.

list_containers()[source]

Return a list of names of child Containers.

static load(instream, fmt=4, package=None, call_post_load=True, name=None)[source]

Load object(s) from the input stream. Pure Python classes generally won’t need to override this, but extensions will. The format can be supplied in case something other than cPickle is needed.

instream: file or string
Stream to load from.
fmt: int
Format of state data.
package: string
Name of package to look for instream, if instream is a string that is not an existing file.
call_post_load: bool
If True, call post_load().
name: string
Name for root object

Returns the root object.

static load_from_eggfile(filename, observer=None)[source]

Extract files in egg to a subdirectory matching the saved object name and then load object graph state.

filename: string
Name of egg file to be loaded.
observer: callable
Will be called via an EggObserver.

Returns the root object.

static load_from_eggpkg(package, entry_name=None, instance_name=None, observer=None)[source]

Load object graph state by invoking the given package entry point. If specified, the root object is renamed to instance_name.

package: string
Package name.
entry_name: string
Name of entry point.
instance_name: string
Name for root object.
observer: callable
Will be called via an EggObserver.

Returns the root object.

post_load()[source]

Perform any required operations after model has been loaded.

pre_delete()[source]

Perform any required operations before the model is deleted.

raise_exception(msg, exception_class=<type 'exceptions.Exception'>)[source]

Raise an exception.

remove(name)[source]

Remove the specified child from this container and remove any public trait objects that reference that child. Notify any observers.

remove_source(destination)[source]

Remove the source from the given destination io trait. This will allow the destination to later be connected to a different source or to have its value directly set.

remove_trait(name)[source]

Overrides HasTraits definition of remove_trait in order to keep track of dynamically added traits for serialization.

revert_to_defaults(recurse=True)[source]

Sets the values of all of the inputs to their default values.

save(outstream, fmt=4, proto=-1)[source]

Save the state of this object and its children to the given output stream. Pure Python classes generally won’t need to override this because the base class version will suffice, but Python extension classes will have to override. The format can be supplied in case something other than cPickle is needed.

outstream: file or string
Stream to save to.
fmt: int
Format for saved data.
proto: int
Protocol used.
save_to_egg(name, version, py_dir=None, src_dir=None, src_files=None, child_objs=None, dst_dir=None, fmt=4, proto=-1, use_setuptools=False, observer=None)[source]

Save state and other files to an egg. Typically used to copy all or part of a simulation to another user or machine. By specifying child containers in child_objs, it will be possible to create instances of just those containers from the installed egg. Child container names should be specified relative to this container.

name: string
Name for egg, must be an alphanumeric string.
version: string
Version for egg, must be an alphanumeric string.
py_dir: string
The (root) directory for local Python files. It defaults to the current directory.
src_dir: string
The root of all (relative) src_files.
src_files: list
List of paths to files to be included in the egg.
child_objs: list
List of child objects for additional entry points.
dst_dir: string
The directory to write the egg in.
fmt: int
Passed to eggsaver.save().
proto: int
Passed to eggsaver.save().
use_setuptools: bool
Passed to eggsaver.save_to_egg().
observer: callable
Will be called via an EggObserver.

After collecting entry point information, calls eggsaver.save_to_egg(). Returns (egg_filename, required_distributions, orphan_modules).

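A sketch of an egg round trip using the calls above; top is assumed to be a fully configured Container (or Component) hierarchy.

    from openmdao.main.container import Container

    egg_name, dists, orphans = top.save_to_egg('my_model', '1.0')

    # Later, possibly on another machine, restore the hierarchy from the egg.
    restored = Container.load_from_eggfile(egg_name)
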
set(path, value, index=None, srcname=None, force=False)[source]

Set the value of the Variable specified by the given path, which may contain ‘.’ characters. The Variable will be set to the given value, subject to validation and constraints. index, if not None, should be a list of ints, at most one for each array dimension of the target value.

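For example, assuming asm is an Assembly with a child component c1 that has a variable x, values can be read and written through dotted paths:

    asm.set('c1.x', 4.0)       # validated assignment through the framework
    value = asm.get('c1.x')    # dotted-path retrieval
    full = asm.get_pathname()  # full pathname of the container itself
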
set_source(name, source)[source]

Mark the named io trait as a destination by registering a source for it, which will prevent it from being set directly or connected to another source.

tree_rooted()[source]

Called after the hierarchy containing this Container has been defined back to the root. This does not guarantee that all sibling Containers have been defined. It also does not guarantee that this component is fully configured to execute. Classes that override this function must call their base class version.

This version calls tree_rooted() on all of its child Containers.

values(recurse=False, **metadata)[source]

Return a list of children of this Container that have matching trait metadata. If recurse is True, child Containers will also be iterated over.

name

Name of the Container

openmdao.main.container.set_as_top(cont)[source]

Specifies that the given Container is the top of a Container hierarchy.

openmdao.main.container.get_default_name(obj, scope)[source]

Return a unique name for the given object in the given scope.

openmdao.main.container.dump(cont, recurse=False, stream=None)[source]

Print all items having iotype metadata and their corresponding values to the given stream. If the stream is not supplied, it defaults to sys.stdout.

dataflow.py

class openmdao.main.dataflow.Dataflow(parent, scope=None, members=None)[source]

Bases: openmdao.main.seqentialflow.SequentialWorkflow

A Dataflow consists of a collection of Components which are executed in data flow order.

add(comp)[source]

Add a new component to the workflow.

config_changed()[source]

Notifies the Workflow that workflow configuration (dependencies, etc.) has changed.

remove(comp)[source]

Remove a component from this Workflow

scope[source]

driver.py

Driver class definition

class openmdao.main.driver.Driver(doc=None)[source]

Bases: openmdao.main.component.Component

A Driver iterates over a workflow of Components until some condition is met.

Instance (ICaseRecorder) recorder

Case recorder for iteration data.

  • required: False
  • copy: ‘deep’
Instance (Workflow) workflow

None

  • copy: ‘deep’
add_event(name)

Adds an event variable to be set by the driver.

name: string
name of the event variable the driver should set during execution
add_workflow(name, wf)[source]

Add a new Workflow with the given name to this Driver

clear_events()

Remove all event variables from the driver’s list.

config_changed()[source]

Call this whenever the configuration of this Component changes, for example, children are added or removed.

continue_iteration()[source]

Return False to stop iterating.

execute()[source]

Iterate over a workflow of Components until some condition is met. If you don’t want to structure your driver to use pre_iteration, post_iteration, etc., just override this function. As a result, none of the <start/pre/post/continue>_iteration() functions will be called.

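A hypothetical driver that keeps the default execute() and relies on the iteration hooks is sketched below; it simply runs its workflow a fixed number of times.

    from openmdao.main.driver import Driver

    class FixedIterationDriver(Driver):
        """Hypothetical driver: run the workflow a fixed number of times."""

        def start_iteration(self):
            super(FixedIterationDriver, self).start_iteration()
            self._count = 0

        def continue_iteration(self):
            return self._count < 5      # returning False ends the loop

        def run_iteration(self):
            super(FixedIterationDriver, self).run_iteration()  # runs the workflow
            self._count += 1
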
get_events()

Return the list of event variables to be set by this driver.

get_expr_depends()[source]

Returns a list of tuples of the form (src_comp_name, dest_comp_name) for each dependency introduced by any ExprEvaluators in this Driver, ignoring any dependencies on components that are inside of this Driver’s iteration set.

is_valid()[source]

Return False if any Component in our workflow(s) is invalid, or if any of our variables is invalid, or if any variable referenced by any of our Expressions is invalid.

iteration_set()[source]

Return a set of all Components in our workflow(s), and recursively in any workflow in any Driver in our workflow(s).

post_iteration()[source]

Called after each iteration.

pre_iteration()[source]

Called prior to each iteration. This is where iteration events are set.

remove_event(name)

Remove the name of the specified event variable from the driver’s list of event variables to be set during execution.

remove_from_workflow(component)[source]

Remove the specified component from our workflow(s).

remove_workflow(name)[source]
run_iteration()[source]

Runs the workflow.

set_events()

Set all events in the event list.

start_iteration()[source]

Called just prior to the beginning of an iteration loop. This can be overridden by inherited classes. It can be used to perform any necessary pre-iteration initialization.

step()[source]

Similar to the ‘execute’ function, but this one only executes a single Component from the workflow each time it’s called.

stop()[source]

eggchecker.py

openmdao.main.eggchecker.check_save_load(comp, py_dir=None, test_dir='test_dir', cleanup=True, fmt=4, logfile=None)[source]

Convenience routine to check that saving & reloading comp works.

comp: Component
The component to check.
py_dir: string or None
The directory in which to find local Python modules.
test_dir: string
Name of a scratch directory to unpack in.
cleanup: bool
If True, the scratch directory will be removed after the test.
fmt: int
The format for the saved state file.
logfile: string or None
Name of file for logging progress.

Creates an egg in the current directory, unpacks it in test_dir via a separate process, and then loads and runs the component in another subprocess. Returns the first non-zero subprocess exit code, or zero if everything succeeded.

exceptions.py

Exception classes for OpenMDAO

exception openmdao.main.exceptions.ConstraintError(msg)[source]

Bases: exceptions.ValueError

Raised when a constraint is violated.

exception openmdao.main.exceptions.CircularDependencyError(msg)[source]

Bases: exceptions.RuntimeError

Raised when a circular dependency occurs.

exception openmdao.main.exceptions.RunInterrupted(msg)[source]

Bases: exceptions.RuntimeError

Raised when run() was interrupted, implying an inconsistent state.

exception openmdao.main.exceptions.RunStopped(msg)[source]

Bases: exceptions.RuntimeError

Raised when run() was stopped, implying a consistent state but not necessarily reflecting input values.

expression.py

A variable that references another member of the OpenMDAO model hierarchy.

expreval.py

class openmdao.main.expreval.ExprEvaluator[source]

Bases: str

A class that translates an expression string into a new string containing any necessary framework access functions, e.g., set, get. The compiled bytecode is stored within the object so that it doesn’t have to be reparsed during later evaluations. A scoping object is required at construction time, and that object determines the form of the translated expression based on scope. Variables that are local to the scoping object do not need to be translated, whereas variables from other objects must be accessed using the appropriate set() or get() call. Array entry access and function invocation are also translated in a similar way. For example, the expression “a+b[2]-comp.y(x)” for a scoping object that contains variables a and b, but not comp, x, or y, would translate to “a+b[2]-self.parent.invoke(‘comp.y’,self.parent.get(‘x’))”.

If lazy_check is False, any objects referenced in the expression must exist at creation time (or any time later that text is set to a different value) or a RuntimeError will be raised. If lazy_check is True, error reporting will be delayed until the expression is evaluated.

If single_name is True, the expression can only be the name of one object, with optional array indexing, but general expressions are not allowed because the expression is intended to be on the LHS of an assignment statement.

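The constructor arguments are not spelled out in this reference; the sketch below assumes the expression text and the scoping object are passed at construction time, and asm is a placeholder scope containing a component c1.

    from openmdao.main.expreval import ExprEvaluator

    expr = ExprEvaluator('c1.y + 2.0*x', asm)    # assumed constructor form

    if expr.check_resolve():                 # all referenced variables resolvable?
        print(expr.evaluate())               # value of the scoped expression
    print(expr.get_referenced_compnames())   # referenced component names, e.g. c1
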
check_resolve()[source]

Return True if all variables referenced by our expression can be resolved.

evaluate()[source]

Return the value of the scoped string, evaluated using the eval() function.

get_referenced_compnames()[source]

Return a set of source or dest Component names based on the pathnames of Variables referenced in our reference string.

get_referenced_varpaths()[source]

Return a set of source or dest Variable pathnames relative to self.parent and based on the names of Variables referenced in our reference string.

refs_valid()[source]

Return True if all variables referenced by our expression are valid.

set(val)[source]

Set the value of the referenced object to the specified value.

factory.py

class openmdao.main.factory.Factory[source]

Bases: object

Base class for objects that know how to create other objects based on a type argument and several optional arguments (version, server id, and resource description).

create(typname, version=None, server=None, res_desc=None, **ctor_args)[source]

Return an object of type typename, using the specified package version, server location, and resource description.

get_available_types(groups=None)[source]

Return a set of tuples of the form (typename, dist_version), one for each available plugin type in the given entry point groups. If groups is None, return the set for all openmdao entry point groups.

factorymanager.py

Manages the creation of framework objects, either locally or remotely.

openmdao.main.factorymanager.create(typname, version=None, server=None, res_desc=None, **ctor_args)[source]

Create and return an object specified by the given type, version, etc.

openmdao.main.factorymanager.register_class_factory(fct)[source]

Add a Factory to the factory list.

openmdao.main.factorymanager.get_available_types(groups=None)[source]

Return a set of tuples of the form (typename, dist_version), one for each available plugin type in the given entry point groups. If groups is None, return the set for all openmdao entry point groups.

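A small sketch of the factory manager functions; the type name passed to create() is a placeholder that depends on the plugins installed in the local environment.

    from openmdao.main import factorymanager

    for typname, dist_version in factorymanager.get_available_types():
        print('%s %s' % (typname, dist_version))

    # 'some.plugin.ComponentType' is a placeholder type name.
    obj = factorymanager.create('some.plugin.ComponentType')
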
filevar.py

Support for files, either as File or external files.

class openmdao.main.filevar.FileMetadata(path, **metadata)[source]

Bases: object

Metadata related to a file, specified by keyword arguments (except for ‘path’). By default, the metadata includes:

  • ‘path’, a string, no default value. It may be a glob-style pattern in the case of an external file description. Non-absolute paths are relative to their owning component’s directory.
  • ‘desc’, a string, default null.
  • ‘content_type’, a string, default null.
  • ‘binary’, boolean, default False.
  • ‘big_endian’, boolean, default False. Only meaningful if binary.
  • ‘single_precision’, boolean, default False. Only meaningful if binary.
  • ‘integer_8’, boolean, default False. Only meaningful if binary.
  • ‘unformatted’, boolean, default False. Only meaningful if binary.
  • ‘recordmark_8’, boolean, default False. Only meaningful if unformatted.

In addition, external files have defined behavior for:

  • ‘input’, boolean, default False. If True, the file(s) should exist before execution.
  • ‘output’, boolean, default False. If True, the file(s) should exist after execution.
  • ‘constant’, boolean, default False. If True, the file(s) may be safely symlinked.

Arbitrary additional metadata may be assigned.

get(attr, default)[source]

Return attr value, or default if attr has not been defined.

class openmdao.main.filevar.FileRef(path, owner=None, **metadata)[source]

Bases: openmdao.main.filevar.FileMetadata

A reference to a file on disk. As well as containing metadata information, it supports open() to read the file’s contents. Before open() is called, ‘owner’ must be set to an object supporting check_path() and get_abs_directory() (typically a Component or one of its child Container objects).

copy(owner)[source]

Return a copy of ourselves, owned by owner.

owner: Component
The component used to determine the root for relative paths and checking the legality of absolute paths.
open()[source]

Open file for reading.

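A brief sketch of both classes; the component acting as owner and the file names are placeholders.

    from openmdao.main.filevar import FileMetadata, FileRef

    meta = FileMetadata(path='results/*.dat', binary=True, output=True,
                        desc='raw solver output')
    print(meta.get('binary', False))    # True
    print(meta.get('constant', False))  # False; not defined, so the default is used

    # 'comp' must supply check_path() and get_abs_directory(), e.g. a Component.
    ref = FileRef('input.dat', owner=comp)
    fobj = ref.open()
    contents = fobj.read()
    fobj.close()
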
hasconstraints.py

class openmdao.main.hasconstraints.Constraint(lhs, comparator, rhs, scope=None)[source]

Bases: object

evaluate()[source]

Returns a tuple of the form (lhs, rhs, comparator, is_violated).

class openmdao.main.hasconstraints.HasConstraints(parent)[source]

Bases: object

Add this class as a delegate if your Driver supports both equality and inequality constraints.

add_constraint(expr_string)[source]

Adds a constraint to the driver.

add_eq_constraint(lhs, rhs)[source]

Adds an equality constraint as two strings, a left hand side and a right hand side.

add_ineq_constraint(lhs, comparator, rhs)[source]

Adds an inequality constraint as three strings; a left hand side, a comparator (‘<’,’>’,’<=’, or ‘>=’), and a right hand side.

clear_constraints()[source]

Removes all constraints.

eval_eq_constraints()[source]

Returns a list of tuples of the form (lhs, rhs, comparator, is_violated) from evaluation of equality constraints.

eval_ineq_constraints()[source]

Returns a list of tuples of the form (lhs, rhs, comparator, is_violated) from evaluation of inequality constraints.

get_eq_constraints()[source]

Returns an ordered dict of equality constraint objects.

get_expr_depends()[source]

Returns a list of tuples of the form (src_comp_name, dest_comp_name) for each dependency introduced by a constraint.

get_ineq_constraints()[source]

Returns an ordered dict of inequality constraint objects.

list_constraints()[source]

Return a list of strings containing constraint expressions.

remove_constraint(expr_string)[source]

Removes the constraint with the given string.

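For example, on a driver that delegates to HasConstraints (expression names are placeholders):

    driver.add_constraint('c1.y = c2.y')             # equality constraint
    driver.add_constraint('c1.x <= 10.0')            # inequality constraint
    driver.add_ineq_constraint('c2.x', '>=', '0.0')  # same thing, as three strings

    print(driver.list_constraints())
    for lhs, rhs, comparator, violated in driver.eval_ineq_constraints():
        print('%s %s %s violated: %s' % (lhs, comparator, rhs, violated))
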
class openmdao.main.hasconstraints.HasEqConstraints(parent)[source]

Bases: openmdao.main.hasconstraints._HasConstraintsBase

add_constraint(expr_string)[source]

Adds a constraint in the form of a boolean expression string to the driver.

add_eq_constraint(lhs, rhs)[source]

Adds an equality constraint as two strings, a left hand side and a right hand side.

eval_eq_constraints()[source]

Returns a list of tuples of the form (lhs, rhs, comparator, is_violated)

get_eq_constraints()[source]

Returns an ordered dict of constraint objects.

class openmdao.main.hasconstraints.HasIneqConstraints(parent)[source]

Bases: openmdao.main.hasconstraints._HasConstraintsBase

add_constraint(expr_string)[source]

Adds a constraint to the driver

add_ineq_constraint(lhs, rel, rhs)[source]

Adds an inequality constraint as three strings; a left hand side, a comparator (‘<’,’>’,’<=’, or ‘>=’), and a right hand side.

eval_ineq_constraints()[source]

Returns a list of constraint values

get_ineq_constraints()[source]

Returns an ordered dict of inequality constraint objects.

hasevents.py

class openmdao.main.hasevents.HasEvents(parent)[source]

Bases: object

This class provides an implementation of the IHasEvents interface

add_event(name)[source]

Adds an event variable to be set by the driver.

name: string
name of the event variable the driver should set during execution
clear_events()[source]

Remove all event variables from the driver’s list.

get_events()[source]

Return the list of event variables to be set by this driver.

remove_event(name)[source]

Remove the name of the specified event variable from the driver’s list of event variables to be set during execution.

set_events()[source]

Set all events in the event list.

hasobjective.py

class openmdao.main.hasobjective.HasObjective(parent)[source]

Bases: object

This class provides an implementation of the IHasObjective interface.

add_objective(expr)[source]

Sets the objective of this driver to be the specified expression. If there is a preexisting objective in this driver, it is replaced.

expr: string
String containing the objective expression.
eval_objective()[source]

Returns the value of the evaluated objective.

get_expr_depends()[source]

Returns a list of tuples of the form (src_comp_name, dest_comp_name) for each dependency introduced by our objective.

get_objective()[source]

Returns the objective object.

list_objective()[source]

Returns the expression string for the objective.

class openmdao.main.hasobjective.HasObjectives(parent)[source]

Bases: object

This class provides an implementation of the IHasObjectives interface.

add_objective(expr)[source]

Adds an objective to the driver.

expr: string
String containing the objective expression.
add_objectives(obj_iter)[source]

Takes an iterator of objective strings and creates objectives for them in the driver.

clear_objectives()[source]

Removes all objectives.

eval_objectives()[source]

Returns a list of values of the evaluated objectives.

get_expr_depends()[source]

Returns a list of tuples of the form (src_comp_name, dest_comp_name) for each dependency introduced by our objectives.

get_objectives()[source]

Returns an ordered dict of objective objects.

list_objectives()[source]

Returns a list of objective expressions.

remove_objective(expr)[source]

Removes the specified objective expression. Spaces within the expression are ignored.

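For example, on a driver that delegates to HasObjectives (expression names are placeholders):

    driver.add_objective('comp.f_xy')
    driver.add_objective('comp.g - 2.0*comp.h')

    print(driver.list_objectives())   # the expression strings
    print(driver.eval_objectives())   # their current values
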
hasparameters.py

class openmdao.main.hasparameters.HasParameters(parent)[source]

Bases: object

This class provides an implementation of the IHasParameters interface

add_parameter(name, low=None, high=None)[source]

Adds a parameter to the driver.

name: string
Name of the variable the driver should vary during execution.
low: float, optional
Minimum allowed value of the parameter.
high: float, optional
Maximum allowed value of the parameter.

If neither “low” nor “high” is specified, the min and max will default to the values in the metadata of the variable being referenced. If they are not specified in the metadata and not provided as arguments, a ValueError is raised.

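For example, on a driver that delegates to HasParameters (variable names are placeholders):

    driver.add_parameter('comp.x', low=-10.0, high=10.0)
    driver.add_parameter('comp.y')       # bounds taken from the variable's metadata

    driver.set_parameters([1.5, 2.0])    # one value per parameter, in list order
    print(driver.list_parameters())
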
add_parameters(param_iter)[source]

Takes an iterator of tuples of the form (param_name, low, high) and adds the parameters to the driver.

clear_parameters()[source]

Removes all parameters.

get_expr_depends()[source]

Returns a list of tuples of the form (src_comp_name, dest_comp_name) for each dependency introduced by a parameter.

get_parameters()[source]

Returns an ordered dict of parameter objects.

list_parameters()[source]

Returns an alphabetized list of parameter names.

remove_parameter(name)[source]

Removes the parameter with the given name.

set_parameters(values)[source]

Pushes the values in the iterator ‘values’ into the corresponding variables in the model.

values: iterator
Iterator of input values with an order defined to match the order of parameters returned by the list_parameters method. ‘values’ must support the len() function.
class openmdao.main.hasparameters.Parameter(low=None, high=None, expr=None)[source]

Bases: object

hasstopcond.py

class openmdao.main.hasstopcond.HasStopConditions(parent)[source]

Bases: object

A delegate that adds handling of stop conditions that are supplied as expression strings.

add_stop_condition(exprstr)[source]
clear_stop_conditions()[source]

Removes all stop conditions.

eval_stop_conditions()[source]

Returns a list of evaluated stop conditions.

get_stop_conditions()[source]

Returns an ordered dict of stop condition strings.

remove_stop_condition(expr_string)[source]

Removes the stop condition matching the given string.

should_stop()[source]

Return True if any of the stopping conditions evaluate to True.

importfactory.py

class openmdao.main.importfactory.ImportFactory[source]

Bases: openmdao.main.factory.Factory

Creates objects using the standard Python __import__ mechanism. The object must have a ctor with the same name as the module, minus the file extension. For example, to create a MyComp object, the module must be named MyComp.py (or .pyc or .pyo). This factory does not support specific version creation or creation on a remote server.

create(typ, version=None, server=None, res_desc=None, **ctor_args)[source]

Tries to import the given named module and return a factory function from it. The factory function or constructor must have the same name as the module. The module must be importable in the current Python environment.

get_available_types(groups=None)[source]

Does nothing but is included to adhere to the Factory interface.

interfaces.py

Interfaces for the OpenMDAO project.

class openmdao.main.interfaces.HasObjectives[source]

Bases: object

An Interface for objects having multiple objectives.

add_objective(expr)[source]

Adds an objective to the driver.

expr: string
String containing the objective expression.
add_objectives(obj_iter)[source]

Takes an iterator of objective strings and creates objectives for them in the driver.

clear_objectives()[source]

Removes all objectives.

eval_objectives()[source]

Returns a list of values of the evaluated objectives.

get_expr_depends()[source]

Returns a list of tuples of the form (src_comp_name, dest_comp_name) for each dependency introduced by our objectives.

get_objectives()[source]

Returns an ordered dict of objective objects.

list_objectives()[source]

Returns a list of objective expression strings.

remove_objective(expr)[source]

Removes the specified objective expression. Spaces within the expression are ignored.

class openmdao.main.interfaces.ICaseIterator[source]

Bases: enthought.traits.has_traits.Interface

An iterator that returns Case objects.

next()[source]

Return the next Case.

class openmdao.main.interfaces.ICaseRecorder[source]

Bases: enthought.traits.has_traits.Interface

A recorder of Cases.

get_iterator()[source]

Return an iterator that matches the format that this recorder uses.

record(case)[source]

Record the given Case.

class openmdao.main.interfaces.IComponent[source]

Bases: enthought.traits.has_traits.Interface

A marker interface for Components.

class openmdao.main.interfaces.IDOEgenerator[source]

Bases: enthought.traits.has_traits.Interface

An iterator that returns arrays of normalized values that are mapped to design variables by a Driver.

Int num_parameters
number of parameters in the DOE
class openmdao.main.interfaces.IDriver[source]

Bases: enthought.traits.has_traits.Interface

A marker interface for Drivers. To make a usable IDriver plug-in, you must still inherit from Driver.

Instance (IWorkflow) workflow

None

  • copy: ‘deep’
iteration_set()[source]

Return a set of names (not pathnames) containing all Components in all of the workflows managed by this Driver

class openmdao.main.interfaces.IFactory[source]

Bases: enthought.traits.has_traits.Interface

An object that creates and returns objects based on a type string.

create(typ)[source]

Create an object of the specified type and return it, or a proxy to it if it resides in another process.

class openmdao.main.interfaces.IHasConstraints[source]

Bases: openmdao.main.interfaces.IHasEqConstraints, openmdao.main.interfaces.IHasIneqConstraints

An Interface for objects containing both equality and inequality constraints.

add_constraint(expr_string)[source]

Adds a constraint as a string containing an assignment or an inequality, e.g., ‘a=b’ or ‘a<=b’.

class openmdao.main.interfaces.IHasEqConstraints[source]

Bases: enthought.traits.has_traits.Interface

An Interface for objects containing equality constraints.

add_constraint(expr_string)[source]

Adds an equality constraint as a string containing an assignment, for example, ‘a = b’.

add_eq_constraint(lhs, rhs)[source]

Adds an equality constraint as two strings, a left hand side and a right hand side.

clear_constraints()[source]

Removes all constraints.

eval_eq_constraints()[source]

Evaluates the constraint expressions and returns a list of tuples of the form (lhs, rhs, relation, is_violated).

get_eq_constraints()[source]

Returns an ordered dict of equality constraint objects.

remove_constraint(expr_string)[source]

Removes the constraint matching the given string. Whitespace is ignored.

class openmdao.main.interfaces.IHasEvents[source]

Bases: enthought.traits.has_traits.Interface

add_event(name)[source]

Adds an event variable to be set when set_events is called.

name: string
name of the event variable that should be set during execution
clear_events()[source]

Remove all event variables from the list.

get_events()[source]

Return the list of event variables to be set.

remove_event(name)[source]

Remove the name of the specified event variable from the list of event variables to be set during execution.

set_events()[source]

Set all events in the event list.

class openmdao.main.interfaces.IHasIneqConstraints[source]

Bases: enthought.traits.has_traits.Interface

An Interface for objects containing inequality constraints.

add_constraint(expr_string)[source]

Adds an inequality constraint as a string containing an inequality, for example, ‘a > b’.

add_ineq_constraint(lhs, rel, rhs)[source]

Adds an inequality constraint as three strings; a left hand side, a right hand side, and a relation. The relation must be one of the following: ‘<’, ‘>’, ‘<=’, or ‘>=’.

clear_constraints()[source]

Removes all constraints.

eval_ineq_constraints()[source]

Evaluates the constraint expressions and returns a list of tuples of the form (lhs, rhs, relation, is_violated).

get_ineq_constraints()[source]

Returns an ordered dict of inequality constraint objects.

remove_constraint(expr_string)[source]

Removes the constraint matching the given string. Whitespace is ignored.

class openmdao.main.interfaces.IHasObjective[source]

Bases: enthought.traits.has_traits.Interface

An Interface for objects having a single objective.

add_objective(expr)[source]

Sets the objective of this driver to be the specified expression. If there is a preexisting objective in this driver, it is replaced.

expr: string
String containing the objective expression.
eval_objective()[source]

Returns the value of the evaluated objective.

get_expr_depends()[source]

Returns a list of tuples of the form (src_comp_name, dest_comp_name) for each dependency introduced by our objective.

get_objective()[source]

Returns the objective object.

list_objective()[source]

Returns the expression string for the objective.

class openmdao.main.interfaces.IHasParameters[source]

Bases: enthought.traits.has_traits.Interface

add_parameter(param_name, low=None, high=None)[source]

Adds a parameter to the driver.

param_name: str
Name of the parameter to add to the driver.
low: number, optional
Minimum allowed value the optimizer can use for this parameter. If not specified, then the ‘low’ value from the variable is used.
high: number, optional
Maximum allowed value the optimizer can use for this parameter. If not specified, then the “high” value from the variable is used.
add_parameters(param_iter)[source]

Takes an iterator of tuples of the form (param_name, low, high) and adds the parameters to the driver.

clear_parameters()[source]

Removes all parameters.

get_parameters()[source]

Returns an ordered dict of parameter objects.

list_parameters()[source]

Lists all the parameters.

remove_parameter(param_name)[source]

Removes the specified parameter. Raises a KeyError if param_name is not found.

param_name: str
the name of the parameter to remove
set_parameters(X)[source]

Pushes the values in the X input array into the corresponding variables in the model.

X: iterator
Iterator of input values with an order defined to match the order of parameters returned by the list_parameters method. X must support the len() function.
class openmdao.main.interfaces.IResourceAllocator[source]

Bases: enthought.traits.has_traits.Interface

An object responsible for allocating CPU/disk resources for a particular host, cluster, load balancer, etc.

deploy(name, resource_desc, criteria)[source]

Deploy a server suitable for resource_desc. criteria is the dictionary returned by time_estimate(). Returns a proxy to the deployed server.

list_allocated_components()[source]

Return a list of tuples (hostname, pid, component_name) for each Component currently allocated by this allocator.

max_servers(resource_desc)[source]

Return the maximum number of servers which could be deployed for resource_desc. The value needn’t be exact, but performance may suffer if it overestimates. The value is used to limit the number of concurrent evaluations.

time_estimate(resource_desc)[source]

Return (estimate, criteria) indicating how well this resource allocator can satisfy the resource_desc request. The estimate will be:

  • >0 for an estimate of walltime (seconds).
  • 0 for no estimate.
  • -1 for no resource at this time.
  • -2 for no support for resource_desc.

The returned criteria is a dictionary containing information related to the estimate, such as load averages, unsupported resources, etc.

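A skeleton that follows the estimate convention above is sketched here; it is purely illustrative, and the 'n_cpus' resource key is a placeholder.

    class SingleHostAllocator(object):
        """Hypothetical allocator illustrating the IResourceAllocator methods."""

        def max_servers(self, resource_desc):
            return 1                      # rough upper bound on concurrent servers

        def time_estimate(self, resource_desc):
            if resource_desc.get('n_cpus', 1) > 1:       # placeholder resource key
                return (-2, {'reason': 'multi-CPU jobs not supported'})
            return (0, {'loadavg': 0.0})  # can run it, but no walltime estimate

        def deploy(self, name, resource_desc, criteria):
            raise NotImplementedError('sketch only; would return a server proxy')

        def list_allocated_components(self):
            return []
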
class openmdao.main.interfaces.ISurrogate[source]

Bases: enthought.traits.has_traits.Interface

get_uncertain_value(value)[source]

Converts a deterministic value into an uncertain quantity that matches the uncertain variable type the surrogate predicts.

predict(X)[source]

Predicts a value from the surrogate model for the given independent values in X.

X: list
the input values for which the predicted output is requested.

Returns the predicted output value

train(X, Y)[source]

Trains the surrogate model, based on the given training data set.

X: iterator of lists
values representing the training case input history
Y: iterator
Training case output history for this surrogate’s output, which corresponds to the training case input history given by X.
class openmdao.main.interfaces.IUncertainVariable[source]

Bases: enthought.traits.has_traits.Interface

A variable which supports uncertainty

expected()[source]

Calculates the expected value of the uncertainty distribution

getvalue()[source]

Returns the value from either expected() or sample(), depending on the global or local uncertainty setting.

sample()[source]

Generates a random number from an uncertainty distribution

class openmdao.main.interfaces.IWorkflow[source]

Bases: enthought.traits.has_traits.Interface

An object that can run a group of components in some order.

Instance (IComponent) scope

None

  • copy: ‘deep’
add(comp)[source]

Add the Component to this workflow.

clear()[source]

Remove all Components from this workflow.

contents()[source]

Return a list of all Components in this Workflow. No ordering is assumed.

remove(comp)[source]

Remove the Component from this workflow. Do not raise an exception if the component is not found.

run()[source]

Run the components in the workflow.

step()[source]

Run a single component in the Workflow.

stop()[source]

Stop all components in this workflow. We assume it’s OK to call stop() on something that isn’t running.

openmdao.main.interfaces.obj_has_interface(obj, *ifaces)[source]

Returns True if the specified object inherits from HasTraits and implements one or more of the specified interfaces.

mp_distributing.py

This module is based on the distributing.py file example which was (temporarily) posted with the multiprocessing module documentation.

class openmdao.main.mp_distributing.Cluster(hostlist, modules)[source]

Bases: openmdao.main.mp_managers.SyncManager

Represents a collection of hosts. Cluster is a subclass of SyncManager, so it allows creation of various types of shared objects.

LoadedObject(*args, **kwds)
LocalAllocator(*args, **kwds)
ObjServer(*args, **kwds)
Process(group=None, target=None, name=None, args=None, kwargs=None)[source]

Return a Process object associated with a host.

file(*args, **kwds)
shutdown()[source]

Shut down all remote managers and then this one.

start()[source]

Start this manager and all remote managers.

class openmdao.main.mp_distributing.Host(hostname, python=None)[source]

Bases: object

Represents a host to use as a node in a cluster. hostname gives the name of the host. python is the path to the Python command to be used on hostname. ssh is used to log in to the host. To log in as a different user, use a host name of the form: “username@somewhere.org”.

poll()[source]

Poll for process status.

register(typeid, callable=None, method_to_typeid=None)[source]

Register proxy info to be sent to remote process.

start_manager(index, authkey, address, files)[source]

Launch remote manager process.

openmdao.main.mp_distributing.current_process()

Return process object representing the current process

mp_managers.py

Based on the standard multiprocessing.managers. It contains minor fixes required to support OpenMDAO resource allocation.

class openmdao.main.mp_managers.BaseManager(address=None, authkey=None, serializer='pickle')[source]

Bases: object

Base class for managers.

connect()[source]

Connect manager object to the server process.

get_server()[source]

Return server object with serve_forever() method and address attribute.

join(timeout=None)[source]

Join the manager process (if it has been spawned).

classmethod register(typeid, callable=None, proxytype=None, exposed=None, method_to_typeid=None, create_method=True)[source]

Register a typeid with the manager type.

start()[source]

Spawn a server process for this manager object.

address
class openmdao.main.mp_managers.SyncManager(address=None, authkey=None, serializer='pickle')[source]

Bases: openmdao.main.mp_managers.BaseManager

Subclass of BaseManager which supports a number of shared object types.

The types registered are those intended for the synchronization of threads, plus dict, list and Namespace.

The multiprocessing.Manager() function creates started instances of this class.

Array(*args, **kwds)
BoundedSemaphore(*args, **kwds)
Condition(*args, **kwds)
Event(*args, **kwds)
JoinableQueue(*args, **kwds)
Lock(*args, **kwds)
Namespace(*args, **kwds)
Pool(*args, **kwds)
Queue(*args, **kwds)
RLock(*args, **kwds)
Semaphore(*args, **kwds)
Value(*args, **kwds)
dict(*args, **kwds)
list(*args, **kwds)
class openmdao.main.mp_managers.BaseProxy(token, serializer, manager=None, authkey=None, exposed=None, incref=True)[source]

Bases: object

A base for proxies of shared objects.

class openmdao.main.mp_managers.Token(typeid, address, id)[source]

Bases: object

Type to uniquely identify a shared object.

address
id
typeid

objserverfactory.py

class openmdao.main.objserverfactory.ObjServer(name='', host='')[source]

Bases: object

An object which knows how to load a model. Executes in a subdirectory of the parent factory’s startup directory. All remote file accesses must be within the tree rooted there.

chmod(path, mode)[source]

Returns os.chmod(path, mode) if path is legal.

path: string
Path to file to modify.
mode: int
New mode bits (permissions).
cleanup()[source]

Cleanup this server’s directory.

static echo(*args)[source]

Simply return our arguments.

execute_command(command, stdin, stdout, stderr, env_vars, poll_delay, timeout)[source]

Run command in subprocess.

command: string
Command line to be executed.
stdin, stdout, stderr: string
Filenames for the corresponding stream.
env_vars: dict
Environment variables for the command.
poll_delay: float (seconds)
Delay between polling subprocess for completion.
timeout: float (seconds)
Maximum time to wait for command completion. A value of zero implies no timeout.
get_host()[source]

Return this server’s host.

get_name()[source]

Return this server’s name.

get_pid()[source]

Return this server’s pid.

load_model(egg_filename)[source]

Load model and return top-level object.

egg_filename: string
Filename of egg to be loaded.
open(filename, mode='r', bufsize=-1)[source]

Returns open(filename, mode, bufsize) if filename is legal.

filename: string
Name of file to open.
mode: string
Access mode.
bufsize: int
Size of buffer to use.
pack_zipfile(patterns, filename)[source]

Create ZipFile of files matching patterns.

patterns: list
List of glob-style patterns.
filename: string
Name of ZipFile to create.
static register(manager)[source]

Register ObjServer proxy info with manager. Not typically called by user code.

manager: Manager
multiprocessing Manager to register with.
stat(path)[source]

Returns os.stat(path) if path is legal.

path: string
Path to file to interrogate.
unpack_zipfile(filename)[source]

Unpack ZipFile filename.

filename: string
Name of ZipFile to unpack.
class openmdao.main.objserverfactory.ObjServerFactory[source]

Bases: openmdao.main.factory.Factory

An ObjServerFactory creates ObjServers which use multiprocessing for communication. Note that multiprocessing is not a transparent distributed object protocol. See the Python documentation for details.

create(typname, version=None, server=None, res_desc=None, **ctor_args)[source]

Create an ObjServer and return a proxy for it.

typname: string
Type of object to create. Currently not used.
version: string or None
Version of typname to create. Currently not used.
server:
Currently not used.
res_desc: dict or None
Required resources. Currently not used.
ctor_args:
Other constructor arguments. If name is specified, that is used as the name of the ObjServer.

pkg_res_factory.py

class openmdao.main.pkg_res_factory.EntryPtLoader(name, group, dist, entry_pt)[source]

Bases: object

Holder of entry points. Will perform lazy importing of distributions as needed.

create(env, **ctor_args)[source]

Return the object created by calling the entry point. If necessary, first activate the distribution and load the entry point, and check for conflicting version dependencies before loading.

class openmdao.main.pkg_res_factory.PkgResourcesFactory(groups, search_path=None)[source]

Bases: openmdao.main.factory.Factory

A Factory that loads plugins using the pkg_resources API, which means it searches through egg info of distributions in order to find any entry point groups corresponding to openmdao plugin types, e.g., openmdao.component, openmdao.variable, etc.

create(typ, version=None, server=None, res_desc=None, **ctor_args)[source]

Create and return an object of the given type, with optional name, version, server id, and resource description.

get_available_types(groups=None)[source]

Return a set of tuples of the form (typename, dist_version), one for each available plugin type in the given entry point groups. If groups is None, return the set for all openmdao entry point groups.

get_loaders(group, active=True)[source]

Return a list of EntryPtLoaders whose group IDs match the given group.

update_search_path(search_path)[source]

Updates the plugin search path.

openmdao.main.pkg_res_factory.import_version(modname, req, env=None)[source]

Import the specified module from the package specified in the Requirement req, if it can be found in the current WorkingSet or in the specified Environment. If a conflicting version already exists in the WorkingSet, a VersionConflict will be raised. If a distrib cannot be found matching the requirement, raise a DistributionNotFound.
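
A hypothetical sketch of import_version(); the distribution and module names are placeholders.

    from pkg_resources import Requirement
    from openmdao.main.pkg_res_factory import import_version

    req = Requirement.parse('myplugin>=0.5')
    # Raises VersionConflict or DistributionNotFound on failure, as noted above.
    import_version('myplugin', req)
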

resource.py

Support for allocation of servers from one or more resources (e.g., the local host, a cluster of remote hosts, etc.).

class openmdao.main.resource.ClusterAllocator(name, machines)[source]

Bases: object

Cluster-based resource allocator. This allocator manages a collection of LocalAllocators, one for each machine in the cluster. machines is a list of dictionaries providing configuration data for each machine in the cluster. At a minimum, each dictionary must specify a host address in ‘hostname’ and the path to the OpenMDAO python command in ‘python’.

We assume that machines in the cluster are similar enough that ranking by load average is reasonable.
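
A hypothetical machines configuration; the hostnames and python paths are placeholders, but each dictionary supplies the required ‘hostname’ and ‘python’ entries described above.

    from openmdao.main.resource import ClusterAllocator, ResourceAllocationManager

    machines = [
        {'hostname': 'node01.cluster', 'python': '/opt/openmdao/bin/python'},
        {'hostname': 'node02.cluster', 'python': '/opt/openmdao/bin/python'},
    ]
    cluster = ClusterAllocator('cluster', machines)
    ResourceAllocationManager.add_allocator(cluster)   # make it available to the manager
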

deploy(name, resource_desc, criteria)[source]

Deploy a server suitable for resource_desc. Uses the allocator saved in criteria. Returns a proxy to the deployed server.

name: string
Name for server.
resource_desc: dict
Description of required resources.
criteria: dict
The dictionary returned by time_estimate().
max_servers(resource_desc)[source]

Returns the total of max_servers() across all LocalAllocators in the cluster.

resource_desc: dict
Description of required resources.
shutdown()[source]

Shut down, releasing resources.

time_estimate(resource_desc)[source]

Returns (estimate, criteria) indicating how well this allocator can satisfy the resource_desc request. The estimate will be:

  • >0 for an estimate of walltime (seconds).
  • 0 for no estimate.
  • -1 for no resource at this time.
  • -2 for no support for resource_desc.

The returned criteria is a dictionary containing information related to the estimate, such as hostnames, load averages, unsupported resources, etc.

This allocator polls each LocalAllocator in the cluster to find the best match and returns that. The best allocator is saved in the returned criteria for a subsequent deploy().

resource_desc: dict
Description of required resources.
class openmdao.main.resource.LocalAllocator(name='LocalAllocator', total_cpus=0, max_load=1.0)[source]

Bases: openmdao.main.resource.ResourceAllocator

Purely local resource allocator. If total_cpus is >0, then that is taken as the number of cpus/cores available. Otherwise the number is taken from multiprocessing.cpu_count(). The max_load parameter specifies the maximum cpu-adjusted load allowed when determining if another server may be started in time_estimate().
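
A minimal sketch, assuming an empty resource description is supported; the allocator name and cpu count are placeholders.

    from openmdao.main.resource import LocalAllocator

    local = LocalAllocator(name='MyLocal', total_cpus=4, max_load=1.0)
    limit = local.max_servers({})            # 4 * 1.0 = 4 if the request is supported
    estimate, criteria = local.time_estimate({})
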

deploy(name, resource_desc, criteria)[source]

Deploy a server suitable for resource_desc. Returns a proxy to the deployed server.

name: string
Name for server.
resource_desc: dict
Description of required resources.
criteria: dict
The dictionary returned by time_estimate().
max_servers(resource_desc)[source]

Returns total_cpus * max_load if resource_desc is supported, otherwise zero.

resource_desc: dict
Description of required resources.
static register(manager)[source]

Register LocalAllocator proxy info with manager. Not typically called by user code.

manager: Manager
multiprocessing Manager to register with.
time_estimate(resource_desc)[source]

Returns (estimate, criteria) indicating how well this allocator can satisfy the resource_desc request. The estimate will be:

  • >0 for an estimate of walltime (seconds).
  • 0 for no estimate.
  • -1 for no resource at this time.
  • -2 for no support for resource_desc.

The returned criteria is a dictionary containing information related to the estimate, such as hostnames, load averages, unsupported resources, etc.

resource_desc: dict
Description of required resources.
class openmdao.main.resource.ResourceAllocationManager[source]

Bases: object

The allocation manager maintains a list of ResourceAllocators, which are used to select the “best fit” for a particular resource request. The manager is initialized with a LocalAllocator for the local host. Additional allocators can be added, and the manager will look for the best fit across all of them.

static add_allocator(allocator)[source]

Add an allocator to the list of resource allocators.

allocator: ResourceAllocator
The allocator to be added.
static allocate(resource_desc)[source]

Determine best resource for resource_desc and deploy. In the case of a tie, the first allocator in the allocators list wins. Returns (proxy-object, server-dict).

resource_desc: dict
Description of required resources.
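
A hedged sketch of allocating and releasing a server through the manager; the resource description key is an assumption.

    from openmdao.main.resource import ResourceAllocationManager as RAM

    server, server_info = RAM.allocate({'n_cpus': 1})
    try:
        pass  # use the server proxy here
    finally:
        RAM.release(server)
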
static get_allocator(index)[source]

Return allocator at index.

index: int
List index for allocator to be returned.
static get_hostnames(resource_desc)[source]

Determine best resource for resource_desc and return hostnames. In the case of a tie, the first allocator in the allocators list wins. Typically used by parallel code wrappers which have MPI or something similar for process deployment.

resource_desc: dict
Description of required resources.
static get_instance()[source]

Return singleton instance.

static insert_allocator(index, allocator)[source]

Insert an allocator into the list of resource allocators.

index: int
List index for the insertion point.
allocator: ResourceAllocator
The allocator to be inserted.
static max_servers(resource_desc)[source]

Returns the maximum number of servers compatible with resource_desc. This should be considered an upper limit on the number of concurrent allocations attempted.

resource_desc: dict
Description of required resources.
static release(server)[source]

Release a server (proxy).

server: multiprocessing proxy
Server to be released.
class openmdao.main.resource.ResourceAllocator(name)[source]

Bases: openmdao.main.objserverfactory.ObjServerFactory

Base class for allocators. Allocators estimate the suitability of a resource and can deploy on that resource.

check_orphan_modules(resource_value)[source]

Returns True if this allocator can support the specified ‘orphan’ modules.

resource_value: list
List of ‘orphan’ module names.
check_required_distributions(resource_value)[source]

Returns True if this allocator can support the specified required distributions.

resource_value: list
List of Distributions.
deploy(name, resource_desc, criteria)[source]

Deploy a server suitable for resource_desc. Returns a proxy to the deployed server.

name: string
Name for server.
resource_desc: dict
Description of required resources.
criteria: dict
The dictionary returned by time_estimate().
get_name()[source]

Returns this allocator’s name.

max_servers(resource_desc)[source]

Return the maximum number of servers which could be deployed for resource_desc. The value needn’t be exact, but performance may suffer if it overestimates. The value is used to limit the number of concurrent evaluations.

resource_desc: dict
Description of required resources.
time_estimate(resource_desc)[source]

Return (estimate, criteria) indicating how well this resource allocator can satisfy the resource_desc request. The estimate will be:

  • >0 for an estimate of walltime (seconds).
  • 0 for no estimate.
  • -1 for no resource at this time.
  • -2 for no support for resource_desc.

The returned criteria is a dictionary containing information related to the estimate, such as hostnames, load averages, unsupported resources, etc.

resource_desc: dict
Description of required resources.
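
A hypothetical sketch of interpreting the (estimate, criteria) return described above; the allocator object and resource description are assumptions.

    estimate, criteria = allocator.time_estimate({'n_cpus': 1})
    if estimate >= 0:       # 0 means "no estimate", >0 is a walltime estimate in seconds
        server = allocator.deploy('svr-1', {'n_cpus': 1}, criteria)
    elif estimate == -1:
        pass                # no resource available right now; retry later
    else:                   # -2: the request cannot be supported by this allocator
        pass
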

seqentialflow.py

class openmdao.main.seqentialflow.SequentialWorkflow(members=None)[source]

Bases: openmdao.main.workflow.Workflow

A Workflow that is a simple sequence of components.

add(comp)[source]

Add new component(s) to the end of the workflow.

clear()[source]

Remove all components from this workflow.

contents()[source]

Returns a list of all components in the workflow.

remove(comp)[source]

Remove a component from the workflow. Do not report an error if the specified component is not found.
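
A hedged sketch of a simple sequential workflow; whether add() expects component objects or names is not stated above, so pre-built Component instances are assumed.

    from openmdao.main.seqentialflow import SequentialWorkflow

    flow = SequentialWorkflow()
    flow.add(comp_a)   # comp_a and comp_b are assumed, already-built Components
    flow.add(comp_b)
    flow.run()         # executes comp_a, then comp_b, in order
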

tvalwrapper.py

class openmdao.main.tvalwrapper.TraitValMetaWrapper(value=None, **metadata)[source]

Bases: object

A class that encapsulates a trait value and any metadata necessary for validation of that trait. For example, a TraitValMetaWrapper for a Float object would include ‘units’ metadata to allow for unit compatibility checking and conversion.
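
A minimal sketch of wrapping a value together with the metadata needed for validation on the receiving side; the value and units are placeholders.

    from openmdao.main.tvalwrapper import TraitValMetaWrapper

    wrapped = TraitValMetaWrapper(9.81, units='m/s**2')   # units metadata travels with the value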

uncertain_distributions.py

class openmdao.main.uncertain_distributions.NormalDistribution(mu=0.0, sigma=1.0)[source]

Bases: openmdao.main.uncertain_distributions.UncertainDistribution

An UncertainDistribution which represents a quantity with a normal distribution of uncertainty.

mu: float
mean value
sigma: float
standard deviation
expected()[source]
sample()[source]
class openmdao.main.uncertain_distributions.TriangularDistribution(max=0.0, min=1.0, mode=0.5, *args, **kwargs)[source]

Bases: openmdao.main.uncertain_distributions.UncertainDistribution

An UncertainDistribution which represents a quantity with a triangular distribution of uncertainty.

min: float
minimum value
max: float
maximum value
mode: float
mode
expected()[source]
sample()[source]
class openmdao.main.uncertain_distributions.UncertainDistribution(valmethod=None)[source]

Bases: object

Base class for uncertain variables.

expected()[source]
getvalue()[source]
sample()[source]
class openmdao.main.uncertain_distributions.UniformDistribution(max=0.0, min=1.0, *args, **kwargs)[source]

Bases: openmdao.main.uncertain_distributions.UncertainDistribution

An UncertainDistribution which represents a quantity with a uniform distribution of uncertainty.

min: float
minimum value
max: float
maximum value
expected()[source]
sample()[source]
class openmdao.main.uncertain_distributions.WeibullDistribution(alpha=1.0, beta=2.0, *args, **kwargs)[source]

Bases: openmdao.main.uncertain_distributions.UncertainDistribution

An UncertainDistribution which represents a quantity with a Weibull distribution of uncertainty.

alpha: float
scale parameter
beta: float
shape parameter
expected()[source]
sample()[source]
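
A hedged sketch of using the distribution classes documented above; the parameter values are placeholders, and expected() is assumed to return the distribution mean.

    from openmdao.main.uncertain_distributions import NormalDistribution

    speed = NormalDistribution(mu=100.0, sigma=5.0)
    mean = speed.expected()   # presumably 100.0
    draw = speed.sample()     # one random draw from the distribution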

workflow.py

Workflow class definition

class openmdao.main.workflow.Workflow(members=None)[source]

Bases: object

A Workflow consists of a collection of Components which are to be executed in some order.

add(comp)[source]

Add a new component to the workflow.

config_changed()[source]

Notifies the Workflow that the workflow configuration (dependencies, etc.) has changed.

contents()[source]

Returns a list of all components in the workflow. No ordering is assumed.

remove(comp)[source]

Remove a component from this Workflow.

run()[source]

Run through the nodes in the workflow list.

step()[source]

Run a single component in the Workflow.

stop()[source]

Stop all nodes. We assume it’s OK to call stop() on something that isn’t running.