Deadlock with custom MPI communicators in JIT

Issue #977 resolved
Patrick Farrell created an issue

The following code used to work with SWIG, but deadlocks since SWIG was removed:

from dolfin import *
comm = mpi_comm_self()
world = mpi_comm_world()

# Only rank 0 builds anything; the "on_boundary" string is JIT-compiled
# via CompiledSubDomain inside DirichletBC
if world.rank == 0:
    mesh = UnitIntervalMesh(comm, 2)
    V = FunctionSpace(mesh, "CG", 1)
    bc = DirichletBC(V, 0.0, "on_boundary")

With SWIG, I get

[pefarrell@aoibheann:/tmp]$ mpiexec -n 2 python demo2.py 
[pefarrell@aoibheann:/tmp]$ 

Since SWIG's removal, I get

[pefarrell@aoibheann:/tmp]$ mpiexec -n 2 python3 demo2.py 
Calling FFC just-in-time (JIT) compiler, this may take some time.

and it hangs.
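
The hang is consistent with the JIT step performing collective MPI operations: only rank 0 reaches the compilation triggered by DirichletBC, so any collective done over the world communicator during that step never completes. The snippet below is a minimal, purely illustrative mpi4py analogue of that pattern; mpi4py and the barrier calls are assumptions used only to show the mechanism, not code from DOLFIN.

from mpi4py import MPI

world = MPI.COMM_WORLD

if world.rank == 0:
    # A collective over COMM_WORLD entered only by rank 0 would hang here,
    # mirroring the JIT deadlock:
    #     world.barrier()
    # The same collective over COMM_SELF completes, because this process is
    # the communicator's only member:
    MPI.COMM_SELF.barrier()

print("rank", world.rank, "finished")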

Comments (4)

  1. Tormod Landet

    @chris_richardson: This is probably the reason the test_mpi_dependent_jiting unit test fails.

  2. Patrick Farrell reporter

    Allow CompiledSubDomain to take in an MPI communicator.

    In DirichletBC, pass the communicator from the mesh to the CompiledSubDomain (if you have one).

    Fixes issue #977.

    → <<cset 4b1cad4e9d97>>

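For reference, a sketch of how the reproducer could pass the communicator explicitly once the fix above is in place. This is illustrative only: the mpi_comm keyword on CompiledSubDomain is an assumption about the committed interface (check the signature in your DOLFIN version); the other calls are standard DOLFIN.

from dolfin import *

comm = mpi_comm_self()
world = mpi_comm_world()

if world.rank == 0:
    mesh = UnitIntervalMesh(comm, 2)
    V = FunctionSpace(mesh, "CG", 1)
    # Build the subdomain explicitly and hand it the mesh communicator so the
    # JIT only involves this rank (keyword name assumed here: mpi_comm)
    boundary = CompiledSubDomain("on_boundary", mpi_comm=mesh.mpi_comm())
    bc = DirichletBC(V, 0.0, boundary)
    # With the fix, plain DirichletBC(V, 0.0, "on_boundary") should also work,
    # since DirichletBC now forwards the mesh communicator itself.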