Unpickling error when using comm.recv to receive a message sent using comm.isend

Issue #128 closed
Alberto Scotti created an issue

In the code, a number of MPI processes send Python objects to the root process using comm.isend(message, dest=0, tag=77)

The root process uses a for loop to receive the messages:

for F in Active:
    (Bin, Xin) = comm.recv(source=F[1], tag=77)

Active is a list containing the ranks of the processes from which we expect messages. The code runs fine on our testing platform (Python 3.6.8, pickleshare 0.7.4), but on our campus cluster (Python 3.6.6, pickleshare 0.7.4) we get the following error.

File "./PyPhysics.py", line 369, in GatherSlice
(Bin, Xin)= comm.recv(source=F[1], tag=77)
File "mpi4py/MPI/Comm.pyx", line 1173, in mpi4py.MPI.Comm.recv
File "mpi4py/MPI/msgpickle.pxi", line 302, in mpi4py.MPI.PyMPI_recv
File "mpi4py/MPI/msgpickle.pxi", line 268, in mpi4py.MPI.PyMPI_recv_match
File "mpi4py/MPI/msgpickle.pxi", line 111, in mpi4py.MPI.Pickle.load
File "mpi4py/MPI/msgpickle.pxi", line 101, in mpi4py.MPI.Pickle.cloads
_pickle.UnpicklingError: invalid load key, '\xc2'.

Is this a bug?
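
For reference, here is a minimal, self-contained sketch of the pattern described above. The names message, Active, Bin and Xin follow the report; the payload sent by the workers and the way Active is built are hypothetical. Note that comm.isend() returns a request object that the sender eventually has to complete, for example with req.wait():

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank != 0:
    # Worker: send a small Python object to the root (payload is hypothetical).
    message = (rank, [1.0, 2.0, 3.0])
    req = comm.isend(message, dest=0, tag=77)
    # The request returned by isend() has to be completed eventually,
    # e.g. req.wait() here or MPI.Request.waitall() for a batch of sends.
    req.wait()
else:
    # Root: Active lists the ranks we expect messages from; here it is
    # simply every other rank, while the real code builds it differently.
    Active = [(None, src) for src in range(1, comm.Get_size())]
    for F in Active:
        (Bin, Xin) = comm.recv(source=F[1], tag=77)
        print(F[1], Bin, Xin)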

Comments (9)

  1. Lisandro Dalcin

    Are you waiting at some point for the requests each comm.isend() call returns?

    Are you running with Intel MPI on your cluster? If yes, then look at issue #102.

  2. Alberto Scotti reporter

    I tried setting mpi4py.rc.recv_mprobe = False, and now I get

    _pickle.UnpicklingError: unpickling stack underflow

    Right now I am using MVAPICH2 2.3; I can try Open MPI. I also forgot to mention that development was done on a machine where the toolchain is gcc, and there things work. Finally, not sure if this matters, but the misbehaving script is invoked by a Python interpreter embedded in a C++ code.

  3. Lisandro Dalcin

    Well, that error and the previous one just mean that the messages are getting corrupted. Can you also try setting mpi4py.rc.threads = False? (Both rc settings appear in the sketch at the end of this thread.)

  4. Lisandro Dalcin

    Who is calling MPI_Init()? The C++ code? Are the C++ code and mpi4py using the same MPI implementation?

  5. Alberto Scotti reporter

    Tried with mpi4py.rc.threads = False, and I still get the unpickling stack underflow error.

    The C++ code calls MPI_Init().

    As for the MPI implementation, mpi4py was installed with

    pip3 install --user mpi4py

    If I understand how it works, that should have compiled mpi4py against my MPI environment.

  6. Alberto Scotti reporter

    For the moment, we have implemented a workaround that does not rely on pickled objects (just NumPy arrays).
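
As a footnote for readers hitting a similar problem, the knobs discussed in this thread can be combined in a small diagnostic script. This is only a sketch: the rc flags must be assigned before mpi4py.MPI is imported, and the printed configuration should agree with the MPI implementation that the embedding C++ code initializes.

import mpi4py

# Both flags must be set before importing mpi4py.MPI.
mpi4py.rc.recv_mprobe = False  # plain probe/recv instead of matched probes (comment 2)
mpi4py.rc.threads = False      # do not request MPI thread support (comment 3)

from mpi4py import MPI

# Report which MPI implementation mpi4py was built against and which
# library it is running with; both should match the MPI used by the
# embedding C++ code (comments 4 and 5).
print(mpi4py.get_config())
print(MPI.Get_library_version())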
