MetaModelStructuredComp

MetaModelStructuredComp is a smooth interpolation Component for data that exists on a regular, structured grid. This differs from MetaModelUnStructuredComp, which accepts unstructured data as collections of points.

Note

OpenMDAO contains two components that perform interpolation: SplineComp and MetaModelStructuredComp. While they provide access to mostly the same algorithms, their usage is subtly different. The fundamental differences between them are as follows:

MetaModelStructuredComp is used when you have a set of known data values y on a structured grid x and want to interpolate a new y value at a new x location that lies inside the grid. In this case, you generally start with a known set of fixed “training” values and their locations.

SplineComp is used when you want to create a smooth curve with a large number of points, but you want to control the shape of the curve with a small number of control points. The x locations of the interpolated points (and where applicable, the control points) are fixed and known, but the y values at the control points vary as the curve shape is modified by an upstream connection.

MetaModelStructuredComp can be used for multi-dimensional design spaces, whereas SplineComp is restricted to one dimension.

MetaModelStructuredComp produces smooth fits through provided training data using polynomial splines of various orders. The interpolation methods include three that wrap methods in scipy.interpolate, as well as five methods that are written in pure Python. For all methods, derivatives are automatically computed. The following table summarizes the methods and gives the polynomial order of each.

Method          Order   Description
slinear         1       Basic linear interpolation
lagrange2       2       Second order Lagrange polynomial
lagrange3       3       Third order Lagrange polynomial
akima           3       Interpolation using Akima splines
cubic           3       Cubic spline, with continuity of derivatives between segments
scipy_slinear   1       Scipy linear interpolation. Same as slinear, though slower
scipy_cubic     3       Scipy cubic interpolation. More accurate than cubic, but slower
scipy_quintic   5       Scipy quintic interpolation. Most accurate, but slowest
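
As the table suggests, higher-order methods generally fit smooth data more accurately. For intuition, here is a small standalone sketch using scipy.interpolate directly (which the scipy_* methods wrap, though not through this exact call); the grid, function, and query point are arbitrary choices for illustration:

```python
# Standalone scipy sketch (not OpenMDAO): compare the pointwise error of a
# first-order ("slinear") fit against a cubic fit of a smooth function on
# the same coarse training grid.
import numpy as np
from scipy.interpolate import interp1d

x_train = np.linspace(0.5, 4.0, 8)     # coarse 1-D training grid
y_train = np.sqrt(x_train)             # known values at the grid points

linear_fit = interp1d(x_train, y_train, kind='slinear')
cubic_fit = interp1d(x_train, y_train, kind='cubic')

x_new = 1.3                            # query point inside the grid
err_linear = abs(float(linear_fit(x_new)) - np.sqrt(x_new))
err_cubic = abs(float(cubic_fit(x_new)) - np.sqrt(x_new))
print(err_linear, err_cubic)           # the cubic error is much smaller
```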

Note that MetaModelStructuredComp only accepts scalar inputs and outputs. If you have a multivariable function, each input variable needs its own named OpenMDAO input.

For multi-dimensional data, fits are computed on a separable per-axis basis. A single interpolation method is used for all dimensions, so every table dimension must have enough points to support the chosen method. However, if you choose one of the scipy methods, automatic order reduction is supported: if a particular dimension does not have enough training data points to support the selected spline order (e.g., 3 sample points, but an order-5 'scipy_quintic' spline is specified), then the order of the fitted spline is automatically reduced to one of the lower-order scipy methods ('scipy_cubic' or 'scipy_slinear') for that dimension alone.
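
The order-reduction rule can be sketched in a few lines. This is a hypothetical illustration, not OpenMDAO's actual implementation; the helper name and the assumption that an order-k spline needs at least k + 1 points per dimension are ours:

```python
# Hypothetical sketch of per-dimension order reduction for the scipy
# methods: assuming an order-k spline needs at least k + 1 points, the
# requested order is lowered to the highest supported order per axis.
SCIPY_ORDERS = {'scipy_slinear': 1, 'scipy_cubic': 3, 'scipy_quintic': 5}

def reduced_order(method, n_points):
    """Largest order in {5, 3, 1} that is <= the requested order and
    is supported by n_points training points along one dimension."""
    requested = SCIPY_ORDERS[method]
    for order in (5, 3, 1):
        if order <= requested and n_points >= order + 1:
            return order
    raise ValueError("each dimension needs at least 2 training points")

# 3 points cannot support quintic (6 pts) or cubic (4 pts), so this
# dimension falls back to linear; 5 points are enough for cubic.
print(reduced_order('scipy_quintic', 3))   # 1
print(reduced_order('scipy_quintic', 5))   # 3
```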

Extrapolation is supported, but disabled by default. It can be enabled via the extrapolate option (see below).
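
For intuition on what extrapolation means here, the following standalone scipy sketch (not OpenMDAO) shows the difference between rejecting and extrapolating out-of-bounds queries:

```python
# Standalone scipy sketch: by default interp1d raises for points outside
# the training grid, but it can be asked to extrapolate instead. The
# extrapolate option on MetaModelStructuredComp plays the analogous role.
import numpy as np
from scipy.interpolate import interp1d

x = np.linspace(0.0, 1.0, 5)
y = 2.0 * x                                        # exactly linear data

bounded = interp1d(x, y)                           # raises out of bounds
extrap = interp1d(x, y, fill_value='extrapolate')  # continues the line

print(float(extrap(1.5)))   # 3.0, since y = 2x continues past the grid
```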

MetaModelStructuredComp Options

The options demonstrated in this section include method, extrapolate, vec_size, and training_data_gradients.

MetaModelStructuredComp Constructor

The call signature for the MetaModelStructuredComp constructor is:

MetaModelStructuredComp.__init__(**kwargs)

Initialize all attributes.

Parameters:
    **kwargs : dict of keyword arguments
        Keyword arguments that will be mapped into the Component options.

MetaModelStructuredComp Examples

A simple quick-start example is fitting the exclusive-or (“XOR”) operator between two inputs, x and y:

import numpy as np
import openmdao.api as om

# Create regular grid interpolator instance
xor_interp = om.MetaModelStructuredComp(method='scipy_slinear')

# set up inputs and outputs
xor_interp.add_input('x', 0.0, training_data=np.array([0.0, 1.0]), units=None)
xor_interp.add_input('y', 1.0, training_data=np.array([0.0, 1.0]), units=None)


xor_interp.add_output('xor', 1.0, training_data=np.array([[0.0, 1.0], [1.0, 0.0]]), units=None)

# Set up the OpenMDAO model
model = om.Group()
model.add_subsystem('comp', xor_interp, promotes=["*"])
prob = om.Problem(model)
prob.setup()

# Now test out a 'fuzzy' XOR
prob.set_val('x', 0.9)
prob.set_val('y', 0.001242)

prob.run_model()

print(prob.get_val('xor'))

# we can verify all gradients by checking against finite-difference
prob.check_partials(compact_print=True);
[0.8990064]

An important consideration for multi-dimensional input is that the order in which the input variables are added sets the expected dimension of the output training data. For example, if inputs x, y, and z are added to the component with training data array lengths of 5, 12, and 20 respectively, and are added in x, y, and z order, then the output training data must be an ndarray with shape (5, 12, 20).
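
This shape rule can be checked with plain numpy (independent of OpenMDAO), using the lengths from the paragraph above:

```python
# Pure numpy check of the shape rule: inputs added in x, y, z order with
# 5, 12, and 20 training points require output training data of shape
# (5, 12, 20).
import numpy as np

x = np.linspace(0.0, 1.0, 5)
y = np.linspace(0.0, 1.0, 12)
z = np.linspace(0.0, 1.0, 20)

# indexing='ij' keeps the axis order matched to the input order
X, Y, Z = np.meshgrid(x, y, z, indexing='ij')
f = X + Y * Z               # any function evaluated on the full grid

print(f.shape)              # (5, 12, 20)
```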

This is illustrated by the example:

# create input param training data, of sizes 25, 5, and 10 points resp.
p1 = np.linspace(0, 100, 25)
p2 = np.linspace(-10, 10, 5)
p3 = np.linspace(0, 1, 10)

# can use meshgrid to create a 3D array of test data
P1, P2, P3 = np.meshgrid(p1, p2, p3, indexing='ij')
f = np.sqrt(P1) + P2 * P3

# verify the shape matches the order and size of the input params
print(f.shape)
(25, 5, 10)
# Create regular grid interpolator instance
interp = om.MetaModelStructuredComp(method='scipy_cubic')
interp.add_input('p1', 0.5, training_data=p1)
interp.add_input('p2', 0.0, training_data=p2)
interp.add_input('p3', 3.14, training_data=p3)

interp.add_output('f', 0.0, training_data=f)

# Set up the OpenMDAO model
model = om.Group()
model.add_subsystem('comp', interp, promotes=["*"])
prob = om.Problem(model)
prob.setup()

# set inputs
prob.set_val('p1', 55.12)
prob.set_val('p2', -2.14)
prob.set_val('p3', 0.323)

prob.run_model()

print(prob.get_val('f'))

# we can verify all gradients by checking against finite-difference
prob.check_partials(compact_print=True);
[6.73306472]

You can also predict multiple independent output points by setting the vec_size argument equal to the number of points you want to predict. Here, we set it to 2 and predict two points with MetaModelStructuredComp:

# create input param training data, of sizes 25, 5, and 10 points resp.
p1 = np.linspace(0, 100, 25)
p2 = np.linspace(-10, 10, 5)
p3 = np.linspace(0, 1, 10)

# can use meshgrid to create a 3D array of test data
P1, P2, P3 = np.meshgrid(p1, p2, p3, indexing='ij')
f = np.sqrt(P1) + P2 * P3

# Create regular grid interpolator instance
interp = om.MetaModelStructuredComp(method='scipy_cubic', vec_size=2)
interp.add_input('p1', 0.5, training_data=p1)
interp.add_input('p2', 0.0, training_data=p2)
interp.add_input('p3', 3.14, training_data=p3)

interp.add_output('f', 0.0, training_data=f)

# Set up the OpenMDAO model
model = om.Group()
model.add_subsystem('comp', interp, promotes=["*"])
prob = om.Problem(model)
prob.setup()

# set inputs
prob.set_val('p1', np.array([55.12, 12.0]))
prob.set_val('p2', np.array([-2.14, 3.5]))
prob.set_val('p3', np.array([0.323, 0.5]))

prob.run_model()

print(prob.get_val('f'))
[6.73306472 5.21186454]

Finally, it is possible to compute gradients with respect to the given output training data. These gradients are not computed by default, but can be enabled by setting the option training_data_gradients to True. When this is done, for each output that is added to the component, a corresponding input is added to the component with the same name but with an _train suffix. This allows you to connect in the training data as an input array, if desired.

The following example shows the use of training data gradients. This is the same example problem as above, but note that training_data_gradients has been set to True. This automatically creates an input named f_train when the output f is added. The gradient of f with respect to f_train can also be seen to match the finite-difference estimate in the check_partials output.

# create input param training data, of sizes 25, 5, and 10 points resp.
p1 = np.linspace(0, 100, 25)
p2 = np.linspace(-10, 10, 5)
p3 = np.linspace(0, 1, 10)

# can use meshgrid to create a 3D array of test data
P1, P2, P3 = np.meshgrid(p1, p2, p3, indexing='ij')
f = np.sqrt(P1) + P2 * P3

# verify the shape matches the order and size of the input params
print(f.shape)
(25, 5, 10)
# Create regular grid interpolator instance
interp = om.MetaModelStructuredComp(method='scipy_cubic', training_data_gradients=True)
interp.add_input('p1', 0.5, training_data=p1)
interp.add_input('p2', 0.0, training_data=p2)
interp.add_input('p3', 3.14, training_data=p3)

interp.add_output('f', 0.0, training_data=f)

# Set up the OpenMDAO model
model = om.Group()
model.add_subsystem('comp', interp, promotes=["*"])
prob = om.Problem(model)
prob.setup()

# set inputs
prob.set_val('p1', 55.12)
prob.set_val('p2', -2.14)
prob.set_val('p3', 0.323)

prob.run_model()

print(prob.get_val('f'))
# we can verify all gradients by checking against finite-difference
prob.check_partials(compact_print=True);
[6.73306472]

Standalone Interface for Table Interpolation

The underlying interpolation algorithms can be used standalone (i.e., outside of MetaModelStructuredComp) through the InterpND class, which can be useful for inclusion in another component. The following code shows how to perform interpolation on the same table as in the previous examples. This time, we choose 'lagrange3' as the interpolation algorithm.

from openmdao.components.interp_util.interp import InterpND

# create input param training data, of sizes 25, 5, and 10 points resp.
p1 = np.linspace(0, 100, 25)
p2 = np.linspace(-10, 10, 5)
p3 = np.linspace(0, 1, 10)

# can use meshgrid to create a 3D array of test data
P1, P2, P3 = np.meshgrid(p1, p2, p3, indexing='ij')
f = np.sqrt(P1) + P2 * P3

interp = InterpND(method='lagrange3', points=(p1, p2, p3), values=f)

x = np.array([55.12, -2.14, 0.323])
f, df_dx = interp.interpolate(x, compute_derivative=True)

print(f)
print(df_dx)
[6.73306794]
[[ 0.06734927  0.323      -2.14      ]]
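
Since the training table was generated from the analytic function f = sqrt(p1) + p2 * p3, the printed df_dx can be sanity-checked against the exact gradient [0.5 / sqrt(p1), p3, p2] at the query point (plain numpy, independent of OpenMDAO):

```python
# Analytic gradient of f = sqrt(p1) + p2 * p3 at the query point, for
# comparison with the df_dx returned by InterpND above.
import numpy as np

p1, p2, p3 = 55.12, -2.14, 0.323
grad = np.array([0.5 / np.sqrt(p1), p3, p2])
print(grad)   # close to [0.06734927, 0.323, -2.14] from the lagrange3 fit
```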