The preCICE python bindings

The pyprecice package

precice

The Python module precice offers Python language bindings to the C++ coupling library preCICE. Please refer to precice.org for further information.

class cyprecice.Participant

Main Application Programming Interface of preCICE. To adapt a solver to preCICE, use the following main structure:

  • Create an object of Participant with Participant()

  • Initialize preCICE with Participant::initialize()

  • Advance to the next (time)step with Participant::advance()

  • Finalize preCICE with Participant::finalize()

We use solver, simulation code, and participant as synonyms; the preferred term in this documentation is participant.
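The steps above can be sketched as a minimal adapter skeleton. The mesh name, data names, and the `solver` object with its methods are hypothetical placeholders for your own simulation code, not part of the preCICE API:

```python
def run_coupled_solver(participant, solver):
    """Minimal adapter sketch following the structure above.

    `participant` is an already constructed precice.Participant;
    `solver`, all of its methods, and the mesh/data names below
    are hypothetical placeholders for your own simulation code.
    """
    mesh_name, read_name, write_name = "MeshOne", "Forces", "Displacements"
    vertex_ids = participant.set_mesh_vertices(
        mesh_name, solver.coupling_coordinates())
    participant.initialize()
    while participant.is_coupling_ongoing():
        # never exceed the time step size allowed by preCICE
        dt = min(solver.stable_time_step_size(),
                 participant.get_max_time_step_size())
        forces = participant.read_data(mesh_name, read_name, vertex_ids, dt)
        solver.step(forces, dt)
        participant.write_data(mesh_name, write_name, vertex_ids,
                               solver.coupling_values())
        participant.advance(dt)
    participant.finalize()
```

For implicit coupling, the loop body additionally needs the checkpointing calls described under requires_writing_checkpoint() and requires_reading_checkpoint().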

__init__()

Constructor of Participant class.

Parameters

solver_name : string

Name of the solver

configuration_file_name : string

Name of the preCICE config file

solver_process_index : int

Rank of the process

solver_process_size : int

Total number of processes

communicator: mpi4py.MPI.Intracomm, optional

Custom MPI communicator to use

Returns

Participant : object

Object pointing to the defined participant

Example

>>> participant = precice.Participant("SolverOne", "precice-config.xml", 0, 1)
preCICE: This is preCICE version X.X.X
preCICE: Revision info: vX.X.X-X-XXXXXXXXX
preCICE: Configuring preCICE with configuration: "precice-config.xml"
classmethod __new__(*args, **kwargs)
advance(computed_timestep_length)

Advances preCICE after the solver has computed one timestep.

Parameters

computed_timestep_length : double

Length of timestep used by the solver.

Notes

Previous calls:

initialize() has been called successfully. The solver has computed one timestep. The solver has written all coupling data. finalize() has not yet been called.

Tasks completed:

Coupling data values specified in the configuration are exchanged. Coupling scheme state (computed time, computed timesteps, …) is updated. The coupling state is logged. Configured data mapping schemes are applied. [Second Participant] Configured post processing schemes are applied. Meshes with data are exported to files if configured.

finalize()

Finalizes preCICE.

Notes

Previous calls:

initialize() has been called successfully.

Tasks completed:

Communication channels are closed. Meshes and data are deallocated.

get_data_dimensions(mesh_name, data_name)

Returns the spatial dimensionality of the given data on the given mesh.

Parameters

mesh_name : string

Name of the mesh.

data_name : string

Name of the data.

Returns

dimension : int

The dimensions of the given data.

get_max_time_step_size()

Get the maximum allowed time step size of the current window.

Allows the user to query the maximum allowed time step size in the current window. This should be used to compute the actual time step that the solver uses.

Returns

tag : double

Maximum size of time step to be computed by solver.

Notes

Previous calls:

initialize() has been called successfully.
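In an adapter, the value returned here is typically combined with the solver's own stable time step size. The numbers below are illustrative values, not preCICE calls:

```python
# Hypothetical values: in a real adapter, precice_dt would come from
# participant.get_max_time_step_size() and solver_dt from the solver's
# own stability criterion (e.g. a CFL condition).
solver_dt = 0.01
precice_dt = 0.025
dt = min(solver_dt, precice_dt)  # time step actually used by the solver
print(dt)  # -> 0.01
```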

get_mesh_dimensions(mesh_name)

Returns the spatial dimensionality of the given mesh.

Parameters

mesh_name : string

Name of the mesh.

Returns

dimension : int

The dimensions of the given mesh.

get_mesh_vertex_ids_and_coordinates(mesh_name)

Iterates over the region of interest defined by the bounding box and returns the corresponding vertex IDs and coordinates, omitting the mapping. This function is still experimental.

Parameters

mesh_name : str

Name of the mesh.

Returns

ids : numpy.ndarray

Vertex IDs corresponding to the coordinates

coordinates : numpy.ndarray

The coordinates associated with the IDs (dim * size)

get_mesh_vertex_size(mesh_name)

Returns the number of vertices of a mesh

Parameters

mesh_name : str

Name of the mesh.

Returns

sum : int

Number of vertices of the mesh.

initialize()

Fully initializes preCICE and initializes coupling data. The starting values for coupling data are zero by default. To provide custom values, first set the data using the Data Access methods before calling this method to finally exchange the data.

This function handles:

  • Parallel communication to the coupling partner(s) is set up.

  • Meshes are exchanged between coupling partners and the parallel partitions are created.

  • [Serial Coupling Scheme] If the solver is not starting the simulation, coupling data is received from the coupling partner’s first computation.

Returns

max_timestep : double

Maximum length of first timestep to be computed by the solver.

is_coupling_ongoing()

Checks if the coupled simulation is still ongoing. A coupling is ongoing as long as

  • the maximum number of timesteps has not been reached, and

  • the final time has not been reached.

The user should call finalize() after this function returns false.

Returns

tag : bool

Whether the coupling is ongoing.

Notes

Previous calls:

initialize() has been called successfully.

is_time_window_complete()

Checks if the current coupling time window is completed. The following reasons require several solver time steps per coupling time window:

  • A solver chooses to perform subcycling.

  • An implicit coupling timestep iteration is not yet converged.

Returns

tag : bool

Whether the timestep is complete.

Notes

Previous calls:

initialize() has been called successfully.

map_and_read_data(mesh_name, data_name, coordinates, relative_read_time)

This function reads values at temporary locations from data of a mesh. As opposed to read_data, which uses vertex IDs, this function allows reading data via coordinates that don’t have to be specified during initialization. This is particularly useful for meshes that vary over time. Note that using this function comes at a performance cost, since the specified mapping needs to be computed locally for the given locations, whereas read_data can typically exploit the static interface mesh and pre-compute data structures more efficiently.

Values are read identically to read_data.

Parameters

mesh_name : str

Name of the mesh to read from.

data_name : str

Name of the data to read from.

coordinates : array_like

Coordinates of the vertices.

relative_read_time : double

Point in time where data is read, relative to the beginning of the current time step.

Returns

values : numpy.ndarray

Contains the read data.

Examples

Read scalar data for a 2D problem with 2 vertices:

>>> mesh_name = "MeshOne"
>>> data_name = "DataOne"
>>> coordinates = np.array([[1.0, 1.0], [2.0, 2.0]])
>>> dt = 1.0
>>> values = map_and_read_data(mesh_name, data_name, coordinates, dt)
>>> values.shape
(2,)

Read scalar data for a 2D problem with 2 vertices, where the coordinates are provided as a list of tuples:

>>> mesh_name = "MeshOne"
>>> data_name = "DataOne"
>>> coordinates = [(1.0, 1.0), (2.0, 2.0)]
>>> dt = 1.0
>>> values = map_and_read_data(mesh_name, data_name, coordinates, dt)
>>> values.shape
(2,)
read_data(mesh_name, data_name, vertex_ids, relative_read_time)

This function reads values of the specified vertices from the given data of a mesh. Values are returned as a block of continuous memory.

Parameters

mesh_name : str

Name of the mesh to read from.

data_name : str

Name of the data to read from.

vertex_ids : array_like

Indices of the vertices.

relative_read_time : double

Point in time where data is read, relative to the beginning of the current time step.

Returns

values : numpy.ndarray

Contains the read data.

Notes

Previous calls:

The count of available elements at vertex_ids matches the given size. initialize() has been called.

Examples

Read scalar data for a 2D problem with 5 vertices:

>>> mesh_name = "MeshOne"
>>> data_name = "DataOne"
>>> vertex_ids = [1, 2, 3, 4, 5]
>>> dt = 1.0
>>> values = read_data(mesh_name, data_name, vertex_ids, dt)
>>> values.shape
(5,)

Read vector data for a 2D problem with 5 vertices:

>>> mesh_name = "MeshOne"
>>> data_name = "DataOne"
>>> vertex_ids = [1, 2, 3, 4, 5]
>>> dt = 1.0
>>> values = read_data(mesh_name, data_name, vertex_ids, dt)
>>> values.shape
(5, 2)

Read vector data for a 3D system with 5 vertices:

>>> mesh_name = "MeshOne"
>>> data_name = "DataOne"
>>> vertex_ids = [1, 2, 3, 4, 5]
>>> dt = 1.0
>>> values = read_data(mesh_name, data_name, vertex_ids, dt)
>>> values.shape
(5, 3)
requires_gradient_data_for(mesh_name, data_name)

Checks if the given data set requires gradient data. We check if the data object has been initialized with the gradient flag.

Parameters

mesh_name : str

Name of the mesh to check.

data_name : str

Name of the data to check.

Returns

bool

True if gradient data is required for the given data.

Examples

Check if gradient data is required for a data:

>>> mesh_name = "MeshOne"
>>> data_name = "DataOne"
>>> participant.requires_gradient_data_for(mesh_name, data_name)
requires_initial_data()

Checks if the participant is required to provide initial data. If true, then the participant needs to write initial data to defined vertices prior to calling initialize().

Returns

tag : bool

Returns True if initial data is required.

Notes

Previous calls:

initialize() has not yet been called

requires_mesh_connectivity_for(mesh_name)

Checks if the given mesh requires connectivity.

Parameters

mesh_name : string

Name of the mesh.

Returns

tag : bool

True if mesh connectivity is required.

requires_reading_checkpoint()

Checks if the participant is required to read an iteration checkpoint.

If true, the participant is required to read an iteration checkpoint before calling advance().

preCICE refuses to proceed if reading a checkpoint is required, but this method isn’t called prior to advance().

Notes

This function returns false before the first call to advance().

Previous calls:

initialize() has been called

requires_writing_checkpoint()

Checks if the participant is required to write an iteration checkpoint.

If true, the participant is required to write an iteration checkpoint before calling advance().

preCICE refuses to proceed if writing a checkpoint is required, but this method isn’t called prior to advance().

Notes

Previous calls:

initialize() has been called
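Together with requires_reading_checkpoint(), this enables the standard checkpointing pattern for implicit coupling. A sketch, where `solver`, its `state` attribute, and its methods are hypothetical placeholders for your own code:

```python
import copy

def implicit_coupling_loop(participant, solver):
    """Checkpointing sketch for implicit coupling.

    `participant` is a precice.Participant; `solver` is a hypothetical
    object with a `state` attribute, a `step` method, and a
    `stable_time_step_size` method.
    """
    saved_state = None
    while participant.is_coupling_ongoing():
        if participant.requires_writing_checkpoint():
            saved_state = copy.deepcopy(solver.state)  # save before advancing
        dt = min(solver.stable_time_step_size(),
                 participant.get_max_time_step_size())
        solver.step(dt)
        participant.advance(dt)
        if participant.requires_reading_checkpoint():
            # not converged yet: roll back and iterate the window again
            solver.state = copy.deepcopy(saved_state)
```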

reset_mesh(mesh_name)

Resets a mesh and allows setting it using set_mesh functions again.

Parameters

mesh_name : str

Name of the mesh to reset.

Notes

This function is still experimental. Please refer to the documentation on how to enable and use it.

Previous calls:

advance() has been called

Examples

Reset a mesh with 5 vertices to have 3 vertices.

>>> positions = np.array([[1, 1], [2, 2], [3, 3], [4, 4], [5, 5]])
>>> mesh_name = "MeshOne"
>>> vertex_ids = participant.set_mesh_vertices(mesh_name, positions)
>>> # later in the coupling loop
>>> if remeshing_required():
...     participant.reset_mesh(mesh_name)
...     positions = np.array([[1, 1], [3, 3], [5, 5]])
...     vertex_ids = participant.set_mesh_vertices(mesh_name, positions)
set_mesh_access_region(mesh_name, bounding_box)

This function is required if you don’t want to use the mapping schemes in preCICE, but rather want to use your own solver for data mapping. As opposed to the usual preCICE mapping, only a single mesh (from the other participant) is involved in this situation, since an ‘own’ mesh defined by the participant itself is not required any more. In order to re-partition the received mesh, the participant needs to define the mesh region it wants to read data from and write data to. The mesh region is specified through an axis-aligned bounding box given by the lower and upper [min and max] bounding-box limits in each space dimension [x, y, z]. This function is still experimental.

Parameters

mesh_name : str

Name of the mesh you want to access through the bounding box.

bounding_box : array_like

Axis-aligned bounding box. Example format for 3D: [x_min, x_max, y_min, y_max, z_min, z_max]

Notes

Defining a bounding box for serial runs of the solver (not to be confused with serial coupling mode) is valid. However, a warning is raised in case vertices are filtered out completely on the receiving side, since the associated data values of the filtered vertices are filled with zero data.

This function can only be called once per participant and rank and trying to call it more than once results in an error.

If you combine the direct access with a mapping (say you want to read data from a defined mesh, as usual, but you want to directly access and write data on a received mesh without a mapping) you may not need this function at all since the region of interest is already defined through the defined mesh used for data reading. This is the case if you define any mapping involving the directly accessed mesh on the receiving participant. (In parallel, only the cases read-consistent and write-conservative are relevant, as usual).

The safety factor scaling (see safety-factor in the configuration file) is not applied to the defined access region, and a specified safety factor will be ignored in case there is no additional mapping involved. However, in case a mapping is involved in addition to the direct access, you will receive (and gain access to) vertices inside the defined access region plus vertices inside the safety-factor region resulting from the mapping. The default value of the safety factor is 0.5, i.e. the access region as computed through the involved provided mesh is enlarged by 50%.
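A small helper can build the flat bounding-box list in the expected format from an (N, D) coordinate array. The helper name and the margin parameter are illustrative, not part of the preCICE API:

```python
import numpy as np

def make_access_region(coords, margin=0.0):
    """Build the flat [x_min, x_max, y_min, y_max, ...] bounding box
    that set_mesh_access_region expects from an (N, D) coordinate array.

    `margin` optionally enlarges the box in every direction; both the
    function name and this parameter are hypothetical conveniences.
    """
    coords = np.asarray(coords, dtype=float)
    lows = coords.min(axis=0) - margin
    highs = coords.max(axis=0) + margin
    # interleave per-dimension mins and maxs: [min_0, max_0, min_1, max_1, ...]
    return np.column_stack([lows, highs]).ravel().tolist()

box = make_access_region([[0.0, 1.0], [2.0, 3.0]], margin=0.5)
print(box)  # -> [-0.5, 2.5, 0.5, 3.5]
```

The resulting list could then be passed as the bounding_box argument of set_mesh_access_region.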

set_mesh_edge(mesh_name, first_vertex_id, second_vertex_id)

Sets mesh edge from vertex IDs, returns edge ID.

Parameters

mesh_name : str

Name of the mesh to add the edge to.

first_vertex_id : int

ID of the first vertex of the edge.

second_vertex_id : int

ID of the second vertex of the edge.

Returns

edge_id : int

ID of the edge.

Notes

Previous calls:

Vertices with first_vertex_id and second_vertex_id were added to the mesh with name mesh_name.

set_mesh_edges(mesh_name, vertices)

Creates multiple mesh edges

Parameters

mesh_name : str

Name of the mesh to add the edges to.

vertices : array_like

The IDs of the vertices in a numpy array [N x 2] where N = number of edges. A list of the same shape is also accepted.

Examples

Set mesh edges for a problem with 4 mesh vertices forming a fully connected square (all four sides plus both diagonals).

>>> vertices = np.array([[1, 2], [1, 3], [1, 4], [2, 3], [2, 4], [3, 4]])
>>> vertices.shape
(6, 2)
>>> participant.set_mesh_edges(mesh_name, vertices)
set_mesh_quad(mesh_name, first_vertex_id, second_vertex_id, third_vertex_id, fourth_vertex_id)

Set a mesh Quad from vertex IDs.

Parameters

mesh_name : str

Name of the mesh to add the quad to.

first_vertex_id : int

ID of the first vertex of the quad.

second_vertex_id : int

ID of the second vertex of the quad.

third_vertex_id : int

ID of the third vertex of the quad.

fourth_vertex_id : int

ID of the fourth vertex of the quad.

Notes

Previous calls:

vertices with first_vertex_id, second_vertex_id, third_vertex_id, and fourth_vertex_id were added to the mesh with the name mesh_name

set_mesh_quads(mesh_name, vertices)

Creates multiple mesh quads

Parameters

mesh_name : str

Name of the mesh to add the quads to.

vertices : array_like

The IDs of the vertices in a numpy array [N x 4] where N = number of quads. A list of the same shape is also accepted.

Examples

Set mesh quads for a problem with 4 mesh vertices forming a square.

>>> vertices = np.array([[1, 2, 3, 4]])
>>> vertices.shape
(1, 4)
>>> participant.set_mesh_quads(mesh_name, vertices)
set_mesh_tetrahedra(mesh_name, vertices)

Creates multiple mesh tetrahedra

Parameters

mesh_name : str

Name of the mesh to add the tetrahedra to.

vertices : array_like

The IDs of the vertices in a numpy array [N x 4] where N = number of tetrahedra. A list of the same shape is also accepted.

Examples

Set mesh tetrahedrons for a problem with 4 mesh vertices.

>>> vertices = np.array([[1, 2, 3, 4]])
>>> vertices.shape
(1, 4)
>>> participant.set_mesh_tetrahedra(mesh_name, vertices)
set_mesh_tetrahedron(mesh_name, first_vertex_id, second_vertex_id, third_vertex_id, fourth_vertex_id)

Sets a mesh tetrahedron from vertex IDs.

Parameters

mesh_name : str

Name of the mesh to add the tetrahedron to.

first_vertex_id : int

ID of the first vertex of the tetrahedron.

second_vertex_id : int

ID of the second vertex of the tetrahedron.

third_vertex_id : int

ID of the third vertex of the tetrahedron.

fourth_vertex_id : int

ID of the fourth vertex of the tetrahedron.

Notes

Previous calls:

vertices with first_vertex_id, second_vertex_id, third_vertex_id, and fourth_vertex_id were added to the mesh with the name mesh_name

set_mesh_triangle(mesh_name, first_vertex_id, second_vertex_id, third_vertex_id)

Sets a mesh triangle from vertex IDs.

Parameters

mesh_name : str

Name of the mesh to add the triangle to.

first_vertex_id : int

ID of the first vertex of the triangle.

second_vertex_id : int

ID of the second vertex of the triangle.

third_vertex_id : int

ID of the third vertex of the triangle.

Notes

Previous calls:

vertices with first_vertex_id, second_vertex_id, and third_vertex_id were added to the mesh with the name mesh_name

set_mesh_triangles(mesh_name, vertices)

Creates multiple mesh triangles

Parameters

mesh_name : str

Name of the mesh to add the triangles to.

vertices : array_like

The IDs of the vertices in a numpy array [N x 3] where N = number of triangles. A list of the same shape is also accepted.

Examples

Set mesh triangles for a problem with 4 mesh vertices forming a square with both diagonals.

>>> vertices = np.array([[1, 2, 3], [1, 3, 4], [1, 2, 4], [2, 3, 4]])
>>> vertices.shape
(4, 3)
>>> participant.set_mesh_triangles(mesh_name, vertices)
set_mesh_vertex(mesh_name, position)

Creates a mesh vertex

Parameters

mesh_name : str

Name of the mesh to add the vertex to.

position : array_like

The coordinates of the vertex.

Returns

vertex_id : int

ID of the vertex which is set.

Notes

Previous calls:

Count of available elements at position matches the configured dimension

set_mesh_vertices(mesh_name, positions)

Creates multiple mesh vertices

Parameters

mesh_name : str

Name of the mesh to add the vertices to.

positions : array_like

The coordinates of the vertices in a numpy array [N x D] where N = number of vertices and D = dimensions of geometry. A list of the same shape is also accepted.

Returns

vertex_ids : numpy.ndarray

IDs of the created vertices.

Notes

Previous calls:

initialize() has not yet been called. The count of available elements at positions matches the configured dimension * size.

Examples

Set mesh vertices for a 2D problem with 5 mesh vertices.

>>> positions = np.array([[1, 1], [2, 2], [3, 3], [4, 4], [5, 5]])
>>> positions.shape
(5, 2)
>>> mesh_name = "MeshOne"
>>> vertex_ids = participant.set_mesh_vertices(mesh_name, positions)
>>> vertex_ids.shape
(5,)

Set mesh vertices for a 3D problem with 5 mesh vertices.

>>> positions = np.array([[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4], [5, 5, 5]])
>>> positions.shape
(5, 3)
>>> mesh_name = "MeshOne"
>>> vertex_ids = participant.set_mesh_vertices(mesh_name, positions)
>>> vertex_ids.shape
(5,)

Set mesh vertices for a 3D problem with 5 mesh vertices, where the positions are a list of tuples.

>>> positions = [(1, 1, 1), (2, 2, 2), (3, 3, 3), (4, 4, 4), (5, 5, 5)]
>>> mesh_name = "MeshOne"
>>> vertex_ids = participant.set_mesh_vertices(mesh_name, positions)
>>> vertex_ids.shape
(5,)
start_profiling_section(event_name)

Starts a profiling section with the given event name.

Parameters

event_name : str

Name of the event to profile.

Examples

Start a profiling section with the event name “EventOne”:

>>> event_name = "EventOne"
>>> participant.start_profiling_section(event_name)
stop_last_profiling_section()

Stops the last profiling section.

Examples

Stop the last profiling section:

>>> participant.stop_last_profiling_section()
write_and_map_data(mesh_name, data_name, coordinates, values)

This function writes values at temporary locations to data of a mesh. As opposed to write_data, which uses vertex IDs, this function allows writing data via coordinates that don’t have to be specified during initialization. This is particularly useful for meshes that vary over time. Note that using this function comes at a performance cost, since the specified mapping needs to be computed locally for the given locations, whereas write_data can typically exploit the static interface mesh and pre-compute data structures more efficiently.

Values are passed identically to write_data.

Parameters

mesh_name : str

Name of the mesh to write to.

data_name : str

Name of the data to write to.

coordinates : array_like

The coordinates of the vertices in a numpy array [N x D] where N = number of vertices and D = dimensions of geometry.

values : array_like

Values of data

Examples

Write scalar data for a 2D problem with 5 vertices:

>>> mesh_name = "MeshOne"
>>> data_name = "DataOne"
>>> coordinates = np.array([[c1_x, c1_y], [c2_x, c2_y], [c3_x, c3_y], [c4_x, c4_y], [c5_x, c5_y]])
>>> values = np.array([v1, v2, v3, v4, v5])
>>> participant.write_and_map_data(mesh_name, data_name, coordinates, values)

Write scalar data for a 2D problem with 5 vertices, where the coordinates are provided as a list of tuples, and the values are provided as a list of scalars:

>>> mesh_name = "MeshOne"
>>> data_name = "DataOne"
>>> coordinates = [(c1_x, c1_y), (c2_x, c2_y), (c3_x, c3_y), (c4_x, c4_y), (c5_x, c5_y)]
>>> values = [v1, v2, v3, v4, v5]
>>> participant.write_and_map_data(mesh_name, data_name, coordinates, values)
write_data(mesh_name, data_name, vertex_ids, values)

This function writes values of specified vertices to data of a mesh. Values are provided as a block of continuous memory defined by values. Values are stored in a numpy array [N x D] where N = number of vertices and D = dimensions of geometry. The order of the provided data follows the order specified by vertices.

Parameters

mesh_name : str

Name of the mesh to write to.

data_name : str

Name of the data to write to.

vertex_ids : array_like

Indices of the vertices.

values : array_like

Values of data

Notes

Previous calls:

The count of available elements at values matches the configured dimension * size. The count of available elements at vertex_ids matches the given size. initialize() has been called.

Examples

Write scalar data for a 2D problem with 5 vertices:

>>> mesh_name = "MeshOne"
>>> data_name = "DataOne"
>>> vertex_ids = [1, 2, 3, 4, 5]
>>> values = np.array([v1, v2, v3, v4, v5])
>>> participant.write_data(mesh_name, data_name, vertex_ids, values)

Write vector data for a 2D problem with 5 vertices:

>>> mesh_name = "MeshOne"
>>> data_name = "DataOne"
>>> vertex_ids = [1, 2, 3, 4, 5]
>>> values = np.array([[v1_x, v1_y], [v2_x, v2_y], [v3_x, v3_y], [v4_x, v4_y], [v5_x, v5_y]])
>>> participant.write_data(mesh_name, data_name, vertex_ids, values)

Write vector data for a 3D (D=3) problem with 5 (N=5) vertices:

>>> mesh_name = "MeshOne"
>>> data_name = "DataOne"
>>> vertex_ids = [1, 2, 3, 4, 5]
>>> values = np.array([[v1_x, v1_y, v1_z], [v2_x, v2_y, v2_z], [v3_x, v3_y, v3_z], [v4_x, v4_y, v4_z], [v5_x, v5_y, v5_z]])
>>> participant.write_data(mesh_name, data_name, vertex_ids, values)

Write vector data for a 3D (D=3) problem with 5 (N=5) vertices, where the values are provided as a list of tuples:

>>> mesh_name = "MeshOne"
>>> data_name = "DataOne"
>>> vertex_ids = [1, 2, 3, 4, 5]
>>> values = [(v1_x, v1_y, v1_z), (v2_x, v2_y, v2_z), (v3_x, v3_y, v3_z), (v4_x, v4_y, v4_z), (v5_x, v5_y, v5_z)]
>>> participant.write_data(mesh_name, data_name, vertex_ids, values)
write_gradient_data(mesh_name, data_name, vertex_ids, gradients)

Writes gradient data given as a block. This function writes gradient values of the specified vertices to the given data of a mesh. Values are provided as a block of continuous memory, stored in a numpy array [N x G] where N = number of vertices and G = number of gradient components.

Parameters

mesh_name : str

Name of the mesh to write to.

data_name : str

Name of the data to write to.

vertex_ids : array_like

Indices of the vertices.

gradients : array_like

Gradient values differentiated in the spatial direction (dx, dy) for 2D space, (dx, dy, dz) for 3D space

Notes

Previous calls:

The count of available elements at gradients matches the configured dimension * size. The count of available elements at vertex_ids matches the given size. initialize() has been called. The data has been configured with gradient support (requires_gradient_data_for() returns True).

Examples

Write gradient vector data for a 2D problem with 2 vertices:

>>> mesh_name = "MeshOne"
>>> data_name = "DataOne"
>>> vertex_ids = [1, 2]
>>> gradients = np.array([[v1x_dx, v1y_dx, v1x_dy, v1y_dy], [v2x_dx, v2y_dx, v2x_dy, v2y_dy]])
>>> participant.write_gradient_data(mesh_name, data_name, vertex_ids, gradients)

Write gradient vector data for a 2D problem with 2 vertices, where the gradients are provided as a list of tuples:

>>> mesh_name = "MeshOne"
>>> data_name = "DataOne"
>>> vertex_ids = [1, 2]
>>> gradients = [(v1x_dx, v1y_dx, v1x_dy, v1y_dy), (v2x_dx, v2y_dx, v2x_dy, v2y_dy)]
>>> participant.write_gradient_data(mesh_name, data_name, vertex_ids, gradients)

Write gradient vector data for a 3D problem with 2 vertices:

>>> mesh_name = "MeshOne"
>>> data_name = "DataOne"
>>> vertex_ids = [1, 2]
>>> gradients = np.array([[v1x_dx, v1y_dx, v1z_dx, v1x_dy, v1y_dy, v1z_dy, v1x_dz, v1y_dz, v1z_dz], [v2x_dx, v2y_dx, v2z_dx, v2x_dy, v2y_dy, v2z_dy, v2x_dz, v2y_dz, v2z_dz]])
>>> participant.write_gradient_data(mesh_name, data_name, vertex_ids, gradients)
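If your solver stores per-vertex gradients as D x D Jacobian matrices, they can be flattened into the row layout used in the examples above (all components per derivative direction, direction-major). A numpy sketch with illustrative array contents:

```python
import numpy as np

# Hypothetical per-vertex Jacobians for a 2D vector field at 2 vertices:
# jac[i, a, b] = d(component a) / d(direction b) at vertex i.
jac = np.array([[[1.0, 2.0],
                 [3.0, 4.0]],
                [[5.0, 6.0],
                 [7.0, 8.0]]])
# Reorder to direction-major and flatten to the (N, D*D) block layout,
# matching [v_x_dx, v_y_dx, v_x_dy, v_y_dy] per row as in the examples above:
gradients = jac.transpose(0, 2, 1).reshape(2, 4)
print(gradients[0])  # -> [1. 3. 2. 4.]
```

The resulting array could then be passed directly as the gradients argument of write_gradient_data.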
cyprecice.check_array_like(argument, argument_name, function_name)
cyprecice.get_version_information()

Returns

Current preCICE version information

Migration Guide for Python language bindings for preCICE version 2.0

Steps to move from old Python API to the new API

1. Python language bindings moved to a new repository in the preCICE Project

Previously, the Python language bindings were part of the repository precice/precice. The bindings have now been moved to the independent repository precice/python-bindings.

The installation procedure is the same as before. Please refer to the README.

2. New initialization of Interface

The initialization of the Interface object now initializes the solver and also configures it using the configuration file provided by the user.

Old: Before preCICE Version 2 you had to call:

interface = precice.Interface(solverName, processRank, processSize)
interface.configure(configFileName)

New: The two commands have now been combined into a single one:

interface = precice.Interface(solverName, configFileName, processRank, processSize)

3. Reduced number of input arguments for API calls

Unlike the old bindings, API calls now do not need the array size to be passed as an argument anymore. The bindings directly take the size of the array that you are providing.

For example let us consider the call write_block_vector_data:

Old: The previous call was:

interface.write_block_vector_data(writeDataID, writeDataSize, vertexIDs, writeDataArray)

New: The new function call is:

interface.write_block_vector_data(writeDataID, vertexIDs, writeDataArray)

The same change is applied for all other calls which work with arrays of data.

4. API functions use a return value, if appropriate

In older versions of the Python bindings, arrays were modified by the API in a call-by-reference fashion: a pointer to the array was passed to the API as a function argument. This approach was changed, and the API functions now directly return an array.

For example let us consider the interface function set_mesh_vertices. set_mesh_vertices is used to register vertices for a mesh and it returns an array of vertexIDs.

Old: The old signature of this function was:

vertexIDs = np.zeros(numberofVertices)
interface.set_mesh_vertices(meshID, numberofVertices, grid, vertexIDs)

Note that vertexIDs is passed as an argument to the function.

New: This has now been changed to:

vertexIDs = interface.set_mesh_vertices(meshID, grid)

Here, vertexIDs is directly returned by set_mesh_vertices.

The same change has been applied to the functions read_block_scalar_data and read_block_vector_data.

5. Consequently use numpy arrays as data structure

We consistently use numpy arrays for storing array data (multidimensional lists are still accepted). As an example, the N coupling mesh vertices of a mesh in D dimensions are represented as grid = np.zeros([N, D]). Previous versions of the bindings used either grid = np.zeros([D, N]) (transposed version) or grid = np.zeros(N*D). The same rule applies for data written and read in write_block_vector_data and read_block_vector_data.
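A quick check of the expected layout, with illustrative coordinates:

```python
import numpy as np

N, D = 3, 2  # three coupling mesh vertices in two dimensions
grid = np.array([[0.0, 0.0],
                 [1.0, 0.0],
                 [2.0, 0.0]])  # one row per vertex: the [N, D] layout
print(grid.shape)  # -> (3, 2)
```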

Guide to release new version of python-bindings

The developer who is releasing a new version of the python-bindings is expected to follow this workflow:

The release of the python-bindings repository is made directly from a release branch called python-bindings-v2.1.1.1. This branch is mainly needed to help other developers with testing.

  1. Create a branch called python-bindings-v2.1.1.1 from the latest commit of the develop branch.

  2. Open a Pull Request master ← python-bindings-v2.1.1.1 named after the version (i.e. Release v2.1.1.1) and briefly describe the new features of the release in the PR description.

  3. Bump the version in the following places:

    • CHANGELOG.md on python-bindings-v2.1.1.1.

    • There is no need to bump the version anywhere else, since we use the python-versioneer for maintaining the version everywhere else.

  4. Optionally, test the py-pyprecice Spack package using spack dev-build py-pyprecice@develop.

  5. Draft a New Release in the Releases section of the repository page in a web browser.

    • The release tag needs to be the exact version number (i.e. v2.1.1.1 or v2.1.1.1rc1; compare to existing tags).

    • If this is a stable release, use @target:master. If this is a pre-release, use @target:python-bindings-v2.1.1.1. If you are making a pre-release, directly skip to the pre-release section below.

    • Release title is also the version number (i.e. v2.1.1.1 or v2.1.1.1rc1, compare to existing releases).

  6. As soon as one approving review is made, merge the release PR (from python-bindings-v2.1.1.1) into master.

  7. Merge master into develop for synchronization of develop.

  8. If everything is in order up to this point then the new version can be released by hitting the “Publish release” button in your release Draft. This will create the corresponding tag and trigger publishing the release to PyPI.

  9. Now there exists a tag corresponding to the release on master. Re-run the docker release workflow build-docker.yml via dispatch so that the correct version is picked up by versioneer. Check the version in the container via docker pull precice/python-bindings, then docker run -ti precice/python-bindings, and inside the container run python3 -c "import precice; print(precice.__version__)". ⚠️ There is an open issue that needs fixing: https://github.com/precice/python-bindings/issues/195 ⚠️

  10. Add an empty commit (details https://github.com/precice/python-bindings/issues/109) on master by running the steps:

    git checkout master
    git commit --allow-empty -m "post-tag bump"
    git push
    

    Check that everything is in order via git log. Important: The tag and origin/master should not point to the same commit. For example:

    commit 44b715dde4e3194fa69e61045089ca4ec6925fe3 (HEAD -> master, origin/master)
    Author: Benjamin Rodenberg <benjamin.rodenberg@in.tum.de>
    Date:   Wed Oct 20 10:52:41 2021 +0200
    
        post-tag bump
    
    commit d2645cc51f84ad5eda43b9c673400aada8e1505a (tag: v2.3.0.1)
    Merge: 2039557 aca2354
    Author: Benjamin Rodenberg <benjamin.rodenberg@in.tum.de>
    Date:   Tue Oct 19 12:57:24 2021 +0200
    
        Merge pull request #132 from precice/python-bindings-v2.3.0.1
    
        Release v2.3.0.1
    

    For more details refer to https://github.com/precice/python-bindings/issues/109 and https://github.com/python-versioneer/python-versioneer/issues/217.

  11. Update the py-pyprecice Spack package (temporarily not maintained).

Pre-release

After creating the branch and drafting a release, directly hit the “Publish release” button in your Release Draft. Please note that the release branch is not merged into the master branch during a pre-release. Merging is done only for the stable release. You can check the pre-release artifacts (e.g. release on PyPI) of the release. No further action is required for a pre-release.