test failure with openmpi on ppc64le
This is with mpi4py-3.0.2.
https://koji.fedoraproject.org/koji/taskinfo?taskID=35481045:
BUILDSTDERR: ======================================================================
BUILDSTDERR: ERROR: testCreateF90ComplexDouble (test_datatype.TestDatatype)
BUILDSTDERR: ----------------------------------------------------------------------
BUILDSTDERR: Traceback (most recent call last):
BUILDSTDERR: File "test/test_datatype.py", line 324, in testCreateF90ComplexDouble
BUILDSTDERR: self.check_datatype(None, factory, *args)
BUILDSTDERR: File "test/test_datatype.py", line 144, in check_datatype
BUILDSTDERR: newtype = factory(*args)
BUILDSTDERR: File "mpi4py/MPI/Datatype.pyx", line 309, in mpi4py.MPI.Datatype.Create_f90_complex
BUILDSTDERR: mpi4py.MPI.Exception: MPI_ERR_ARG: invalid argument of some other kind
BUILDSTDERR: ======================================================================
BUILDSTDERR: ERROR: testCreateF90RealDouble (test_datatype.TestDatatype)
BUILDSTDERR: ----------------------------------------------------------------------
BUILDSTDERR: Traceback (most recent call last):
BUILDSTDERR: File "test/test_datatype.py", line 306, in testCreateF90RealDouble
BUILDSTDERR: self.check_datatype(None, factory, *args)
BUILDSTDERR: File "test/test_datatype.py", line 144, in check_datatype
BUILDSTDERR: newtype = factory(*args)
BUILDSTDERR: File "mpi4py/MPI/Datatype.pyx", line 300, in mpi4py.MPI.Datatype.Create_f90_real
BUILDSTDERR: mpi4py.MPI.Exception: MPI_ERR_ARG: invalid argument of some other kind
BUILDSTDERR: ----------------------------------------------------------------------
BUILDSTDERR: Ran 1102 tests in 31.862s
BUILDSTDERR: FAILED (errors=2, skipped=46)
I have no idea if those types should be supported there…
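For context, `MPI_Type_create_f90_real` and `MPI_Type_create_f90_complex` take Fortran-style `(precision, range)` arguments. A minimal sketch of where the double-precision values come from (assuming the tests use the standard IEEE 754 binary64 parameters; the exact values mpi4py passes are not shown in the log above):

```python
import sys

# Fortran selected-real-kind parameters for IEEE 754 binary64:
# precision = significant decimal digits, range = decimal exponent range.
# Fortran's RANGE() for a double is 307, one less than Python's
# max_10_exp, because the largest finite double is just under 1.8e308.
p = sys.float_info.dig             # 15
r = sys.float_info.max_10_exp - 1  # 307
print(p, r)                        # prints "15 307"
```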
Comments (12)
-

Any news about this issue? Can you switch stuff in the spec file and test MPICH first?
-
reporter
MPICH works fine. The problem only occurs with openmpi and only on one architecture…
-
@Jeff Squyres Here's another one for you.
-
What's the output from ompi_info – are the Fortran interfaces supported?
-
The test for MPI_Type_create_f90_integer passed, so I guess the Fortran interfaces are indeed supported. Other related tests also pass; here are the relevant ones (only Real/Complex Double are failing, Real/Complex Single pass):
BUILDSTDERR: testCreateF90ComplexDouble (test_datatype.TestDatatype) ... ERROR
BUILDSTDERR: testCreateF90Integer (test_datatype.TestDatatype) ... ok
BUILDSTDERR: testCreateF90RealDouble (test_datatype.TestDatatype) ... ERROR
BUILDSTDERR: testCreateF90RealSingle (test_datatype.TestDatatype) ... ok
BUILDSTDERR: testCreatef90ComplexSingle (test_datatype.TestDatatype) ... ok
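A standalone reproducer along these lines exercises the same call path as the failing tests (a sketch, not the actual test code; the helper name is made up here, and it skips gracefully when mpi4py is not installed):

```python
def try_create_f90_real(p=15, r=307):
    """Attempt the MPI call behind testCreateF90RealDouble; return a status string."""
    try:
        from mpi4py import MPI
    except ImportError:
        return "skipped: mpi4py not installed"
    try:
        # This is the call that raises MPI_ERR_ARG on ppc64le with Open MPI 4.0.1.
        dtype = MPI.Datatype.Create_f90_real(p, r)
    except Exception as exc:
        return f"failed: {exc}"
    return f"ok: {dtype.Get_name()}"

print(try_create_f90_real())
```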
-
For reference, build time tests (including testCreateF90ComplexDouble, testCreateF90RealDouble) have been working for ppc64el on Debian:
https://buildd.debian.org/status/logs.php?pkg=mpi4py&arch=ppc64el
-
reporter
$ ompi_info
Package: Open MPI mockbuild@buildvm-ppc64le-16.ppc.fedoraproject.org Distribution
Open MPI: 4.0.1
Open MPI repo revision: v4.0.1
Open MPI release date: Mar 26, 2019
Open RTE: 4.0.1
Open RTE repo revision: v4.0.1
Open RTE release date: Mar 26, 2019
OPAL: 4.0.1
OPAL repo revision: v4.0.1
OPAL release date: Mar 26, 2019
MPI API: 3.1.0
Ident string: 4.0.1
Prefix: /usr/lib64/openmpi
Configured architecture: powerpc64le-unknown-linux-gnu
Configure host: buildvm-ppc64le-16.ppc.fedoraproject.org
Configured by: mockbuild
Configured on: Fri Jun 21 04:37:50 UTC 2019
Configure host: buildvm-ppc64le-16.ppc.fedoraproject.org
Configure command line: '--prefix=/usr/lib64/openmpi' '--mandir=/usr/share/man/openmpi-ppc64le' '--includedir=/usr/include/openmpi-ppc64le' '--sysconfdir=/etc/openmpi-ppc64le' '--disable-silent-rules' '--enable-builtin-atomics' '--enable-mpi-java' '--enable-mpi1-compatibility' '--with-sge' '--with-valgrind' '--enable-memchecker' '--with-hwloc=/usr' '--with-libevent=external' '--with-pmix=external' 'CC=gcc' 'CXX=g++' 'LDFLAGS=-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld' 'CFLAGS= -O2 -g -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fexceptions -fstack-protector-strong -grecord-gcc-switches -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection' 'CXXFLAGS= -O2 -g -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fexceptions -fstack-protector-strong -grecord-gcc-switches -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection' 'FC=gfortran' 'FCFLAGS= -O2 -g -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fexceptions -fstack-protector-strong -grecord-gcc-switches -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection'
Built by: mockbuild
Built on: Fri Jun 21 04:49:32 UTC 2019
Built host: buildvm-ppc64le-16.ppc.fedoraproject.org
C bindings: yes
C++ bindings: no
Fort mpif.h: yes (all)
Fort use mpi: yes (full: ignore TKR)
Fort use mpi size: deprecated-ompi-info-value
Fort use mpi_f08: yes
Fort mpi_f08 compliance: The mpi_f08 module is available, but due to limitations in the gfortran compiler and/or Open MPI, does not support the following: array subsections, direct passthru (where possible) to underlying Open MPI's C functionality
Fort mpi_f08 subarrays: no
Java bindings: yes
Wrapper compiler rpath: runpath
C compiler: gcc
C compiler absolute: /usr/bin/gcc
C compiler family name: GNU
C compiler version: 9.1.1
C++ compiler: g++
C++ compiler absolute: /usr/bin/g++
Fort compiler: gfortran
Fort compiler abs: /usr/bin/gfortran
Fort ignore TKR: yes (!GCC$ ATTRIBUTES NO_ARG_CHECK ::)
Fort 08 assumed shape: yes
Fort optional args: yes
Fort INTERFACE: yes
Fort ISO_FORTRAN_ENV: yes
Fort STORAGE_SIZE: yes
Fort BIND(C) (all): yes
Fort ISO_C_BINDING: yes
Fort SUBROUTINE BIND(C): yes
Fort TYPE,BIND(C): yes
Fort T,BIND(C,name="a"): yes
Fort PRIVATE: yes
Fort PROTECTED: yes
Fort ABSTRACT: yes
Fort ASYNCHRONOUS: yes
Fort PROCEDURE: yes
Fort USE...ONLY: yes
Fort C_FUNLOC: yes
Fort f08 using wrappers: yes
Fort MPI_SIZEOF: yes
C profiling: yes
C++ profiling: no
Fort mpif.h profiling: yes
Fort use mpi profiling: yes
Fort use mpi_f08 prof: yes
C++ exceptions: no
Thread support: posix (MPI_THREAD_MULTIPLE: yes, OPAL support: yes, OMPI progress: no, ORTE progress: yes, Event lib: yes)
Sparse Groups: no
Internal debug support: no
MPI interface warnings: yes
MPI parameter check: runtime
Memory profiling support: no
Memory debugging support: no
dl support: yes
Heterogeneous support: no
mpirun default --prefix: no
MPI_WTIME support: native
Symbol vis. support: yes
Host topology support: yes
IPv6 support: no
MPI1 compatibility: yes
MPI extensions: affinity, cuda, pcollreq
FT Checkpoint support: no (checkpoint thread: no)
C/R Enabled Debugging: no
MPI_MAX_PROCESSOR_NAME: 256
MPI_MAX_ERROR_STRING: 256
MPI_MAX_OBJECT_NAME: 64
MPI_MAX_INFO_KEY: 36
MPI_MAX_INFO_VAL: 256
MPI_MAX_PORT_NAME: 1024
MPI_MAX_DATAREP_STRING: 128
MCA allocator: bucket (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA allocator: basic (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA backtrace: execinfo (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA btl: tcp (MCA v2.1.0, API v3.1.0, Component v4.0.1)
MCA btl: usnic (MCA v2.1.0, API v3.1.0, Component v4.0.1)
MCA btl: self (MCA v2.1.0, API v3.1.0, Component v4.0.1)
MCA btl: vader (MCA v2.1.0, API v3.1.0, Component v4.0.1)
MCA btl: uct (MCA v2.1.0, API v3.1.0, Component v4.0.1)
MCA btl: openib (MCA v2.1.0, API v3.1.0, Component v4.0.1)
MCA compress: bzip (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA compress: gzip (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA crs: none (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA dl: dlopen (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA event: external (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA hwloc: external (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA if: linux_ipv6 (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA if: posix_ipv4 (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA installdirs: env (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA installdirs: config (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA memchecker: valgrind (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA memory: patcher (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA mpool: hugepage (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA patcher: overwrite (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA pmix: isolated (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA pmix: flux (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA pmix: ext3x (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA pstat: linux (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA rcache: grdma (MCA v2.1.0, API v3.3.0, Component v4.0.1)
MCA reachable: weighted (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA shmem: mmap (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA shmem: sysv (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA shmem: posix (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA timer: linux (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA dfs: test (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA dfs: app (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA dfs: orted (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA errmgr: default_tool (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA errmgr: default_app (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA errmgr: default_hnp (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA errmgr: default_orted (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA ess: tm (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA ess: env (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA ess: tool (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA ess: pmi (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA ess: singleton (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA ess: slurm (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA filem: raw (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA grpcomm: direct (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA iof: tool (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA iof: orted (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA iof: hnp (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA notifier: syslog (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA odls: pspawn (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA odls: default (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA oob: tcp (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA plm: tm (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA plm: slurm (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA plm: isolated (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA plm: rsh (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA ras: gridengine (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA ras: slurm (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA ras: simulator (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA ras: tm (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA regx: reverse (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA regx: fwd (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA rmaps: mindist (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA rmaps: resilient (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA rmaps: round_robin (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA rmaps: ppr (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA rmaps: seq (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA rmaps: rank_file (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA rml: oob (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA routed: debruijn (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA routed: radix (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA routed: direct (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA routed: binomial (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA rtc: hwloc (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA schizo: flux (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA schizo: slurm (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA schizo: orte (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA schizo: ompi (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA state: orted (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA state: novm (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA state: tool (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA state: hnp (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA state: app (MCA v2.1.0, API v1.0.0, Component v4.0.1)
MCA bml: r2 (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA coll: monitoring (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA coll: sm (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA coll: libnbc (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA coll: basic (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA coll: inter (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA coll: tuned (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA coll: sync (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA coll: self (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA fbtl: pvfs2 (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA fbtl: posix (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA fcoll: individual (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA fcoll: dynamic (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA fcoll: two_phase (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA fcoll: vulcan (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA fcoll: dynamic_gen2 (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA fs: pvfs2 (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA fs: ufs (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA io: ompio (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA io: romio321 (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA mtl: ofi (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA osc: ucx (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA osc: rdma (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA osc: pt2pt (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA osc: monitoring (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA osc: sm (MCA v2.1.0, API v3.0.0, Component v4.0.1)
MCA pml: v (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA pml: ucx (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA pml: monitoring (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA pml: ob1 (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA pml: cm (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA rte: orte (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA sharedfp: sm (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA sharedfp: lockedfile (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA sharedfp: individual (MCA v2.1.0, API v2.0.0, Component v4.0.1)
MCA topo: basic (MCA v2.1.0, API v2.2.0, Component v4.0.1)
MCA topo: treematch (MCA v2.1.0, API v2.2.0, Component v4.0.1)
MCA vprotocol: pessimist (MCA v2.1.0, API v2.0.0, Component v4.0.1)
-
Any updates on this issue? Could you please apply this patch and try again?
https://bitbucket.org/mpi4py/mpi4py/commits/7de7b5c34dc3e5c42f3ab6687690978405a393ae
-
Still seeing the same errors, I think: https://koji.fedoraproject.org/koji/taskinfo?taskID=38210768 (scratch build, so the logs will only be around for a week or so)
-
This bug is now affecting Debian builds, on each of ppc64el, powerpc and ppc64; logs at https://buildd.debian.org/status/package.php?p=mpi4py
(e.g. https://buildd.debian.org/status/fetch.php?pkg=mpi4py&arch=ppc64el&ver=3.0.3-2&stamp=1578647733&raw=0 )
Note this is with mpi4py 3.0.3, with commit 7de7b5c already applied.
The openmpi build log at https://buildd.debian.org/status/fetch.php?pkg=openmpi&arch=mips64el&ver=4.0.2-5&stamp=1576686190&raw=0 shows that openmpi was built with fortran support, and ompi_info confirms it:
Fort compiler: gfortran
Fort compiler abs: /usr/bin/gfortran
Fort ignore TKR: yes (!GCC$ ATTRIBUTES NO_ARG_CHECK ::)
Fort 08 assumed shape: yes
Fort optional args: yes
Fort INTERFACE: yes
Fort ISO_FORTRAN_ENV: yes
Fort STORAGE_SIZE: yes
Fort BIND(C) (all): yes
Fort ISO_C_BINDING: yes
Fort SUBROUTINE BIND(C): yes
Fort TYPE,BIND(C): yes
Fort T,BIND(C,name="a"): yes
Fort PRIVATE: yes
Fort PROTECTED: yes
Fort ABSTRACT: yes
Fort ASYNCHRONOUS: yes
Fort PROCEDURE: yes
Fort USE...ONLY: yes
Fort C_FUNLOC: yes
Fort f08 using wrappers: yes
Fort MPI_SIZEOF: yes
C profiling: yes
C++ profiling: yes
Fort mpif.h profiling: yes
Fort use mpi profiling: yes
Fort use mpi_f08 prof: yes
The last successful build on ppc64el (https://buildd.debian.org/status/fetch.php?pkg=mpi4py&arch=ppc64el&ver=3.0.3-1&stamp=1573797288&raw=0) used
gfortran-9 9.2.1-19
openmpi 3.1.3-11+b1
The current failing build uses
gfortran-9 9.2.1-23
openmpi 4.0.2-5
The jump from OpenMPI 3 to OpenMPI 4 sounds like the likely culprit.
-
- changed status to closed
Archived. Resubmit at https://github.com/mpi4py/mpi4py/issues if appropriate/needed.
-
I don’t see a reason for them not to be supported, but maybe that is the case if Open MPI was not built with Fortran support. I would suggest pinging upstream.