Deadlock with custom MPI communicators in JIT
Issue #977
resolved
The following code used to work with SWIG, but deadlocks since SWIG was removed:
from dolfin import *
comm = mpi_comm_self()
world = mpi_comm_world()
if world.rank == 0:
    mesh = UnitIntervalMesh(comm, 2)
    V = FunctionSpace(mesh, "CG", 1)
    bc = DirichletBC(V, 0.0, "on_boundary")
With SWIG, I get
[pefarrell@aoibheann:/tmp]$ mpiexec -n 2 python demo2.py
[pefarrell@aoibheann:/tmp]$
Since SWIG's removal, I get
[pefarrell@aoibheann:/tmp]$ mpiexec -n 2 python3 demo2.py
Calling FFC just-in-time (JIT) compiler, this may take some time.
and it hangs.
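What seems to be happening: the JIT for the compiled subdomain issues a collective call over the world communicator, but only rank 0 reaches it, so rank 0 waits forever. A minimal mpi4py illustration of that pattern (mpi4py stands in for dolfin here; this is not the actual dolfin code path, just the shape of the deadlock):

from mpi4py import MPI

world = MPI.COMM_WORLD
if world.rank == 0:
    # A collective entered by only one rank never completes:
    # the other ranks have already moved past this point.
    world.Barrier()

Run with mpiexec -n 2 and it hangs in the same way as the report above.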
Comments (4)
- changed status to open
- @chris_richardson: This is probably the reason why the test_mpi_dependent_jiting unit test fails.
reporter - changed status to resolved
Allow CompiledSubDomain to take in an MPI communicator.
In DirichletBC, pass the communicator from the mesh to the CompiledSubDomain (if you have one).
Fixes issue #977. <<cset 4b1cad4e9d97>>
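For reference, a rough sketch of what the extended interface allows from user code; the keyword name mpi_comm is an assumption here, not something confirmed by this issue:

from dolfin import *

comm = mpi_comm_self()
world = mpi_comm_world()

if world.rank == 0:
    mesh = UnitIntervalMesh(comm, 2)
    V = FunctionSpace(mesh, "CG", 1)
    # Hand the subdomain the same communicator as the mesh, so any JIT
    # collective is restricted to the ranks that actually reach this call
    # (keyword name is an assumption).
    boundary = CompiledSubDomain("on_boundary", mpi_comm=mesh.mpi_comm())
    bc = DirichletBC(V, 0.0, boundary)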
The problem is that this if statement branch does not pass the MPI communicator to CompiledSubDomain (and CompiledSubDomain must be extended so that it can receive a comm).
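Schematically, that branch needs to forward the communicator, along the lines of the sketch below (hypothetical names, not the actual dolfin source):

# Inside DirichletBC, when the sub_domain argument is a string (sketch only):
if isinstance(sub_domain, str):
    # Before: no communicator passed, so the JIT falls back to the world
    # communicator and hangs when only some ranks reach this point.
    # sub_domain = CompiledSubDomain(sub_domain)

    # After: forward the communicator of the mesh behind the function space
    # (the mpi_comm keyword name is an assumption).
    sub_domain = CompiledSubDomain(sub_domain, mpi_comm=V.mesh().mpi_comm())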