openmpi segfaults in test_dmplex.py (testClosure on TestPlex_2D_BoxTensor)
Issue #106
resolved
Running the petsc4py 3.9.1 test suite against PETSc 3.9.3 (Debian 3.9.3+dfsg1-2),
python3 runtests.py
passes all tests in serial.
mpirun -np 2 python3 runtests.py
(openmpi 3.1.1.real-3) passes most tests but segfaults in test_dmplex.py.
Verbose output suggests that testClosure is at fault:
$ mpirun -np 2 python3 test_dmplex.py -v
testAdapt (__main__.TestPlex_1D) ... testAdapt (__main__.TestPlex_1D) ... ok
testAdjacency (__main__.TestPlex_1D) ... ok
testAdjacency (__main__.TestPlex_1D) ... ok
testBoundaryLabel (__main__.TestPlex_1D) ... ok
testBoundaryLabel (__main__.TestPlex_1D) ... ok
testClosure (__main__.TestPlex_1D) ... ok
testClosure (__main__.TestPlex_1D) ... ok
testSectionClosure (__main__.TestPlex_1D) ... ok
testSectionClosure (__main__.TestPlex_1D) ... ok
testSectionDofs (__main__.TestPlex_1D) ... ok
testSectionDofs (__main__.TestPlex_1D) ... ok
testTopology (__main__.TestPlex_1D) ... ok
testTopology (__main__.TestPlex_1D) ... ok
testAdapt (__main__.TestPlex_2D) ... ok
testAdapt (__main__.TestPlex_2D) ... ok
testAdjacency (__main__.TestPlex_2D) ... ok
testAdjacency (__main__.TestPlex_2D) ... ok
testBoundaryLabel (__main__.TestPlex_2D) ... ok
testBoundaryLabel (__main__.TestPlex_2D) ... ok
testClosure (__main__.TestPlex_2D) ... ok
testClosure (__main__.TestPlex_2D) ... ok
testSectionClosure (__main__.TestPlex_2D) ... ok
testSectionClosure (__main__.TestPlex_2D) ... ok
testSectionDofs (__main__.TestPlex_2D) ... ok
testSectionDofs (__main__.TestPlex_2D) ... ok
testTopology (__main__.TestPlex_2D) ... ok
testTopology (__main__.TestPlex_2D) ... ok
testAdapt (__main__.TestPlex_2D_BoxTensor) ... ok
testAdapt (__main__.TestPlex_2D_BoxTensor) ... ok
testAdjacency (__main__.TestPlex_2D_BoxTensor) ... ok
testAdjacency (__main__.TestPlex_2D_BoxTensor) ... ok
testBoundaryLabel (__main__.TestPlex_2D_BoxTensor) ... ok
testBoundaryLabel (__main__.TestPlex_2D_BoxTensor) ... [1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[1]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[1]PETSC ERROR: to get more information on the crash.
ok
testClosure (__main__.TestPlex_2D_BoxTensor) ... [grendel:05119] [[27014,0],0] ORTE_ERROR_LOG: Out of resource in file util/show_help.c at line 501
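For context, testClosure exercises DMPlex's transitive closure queries (DMPlexGetTransitiveClosure): the closure of a mesh point is everything reachable by repeatedly following the cone (covering) relation in the mesh's Hasse diagram. A rough pure-Python sketch of that idea, using a made-up point numbering for a single triangle (not the actual petsc4py test):

```python
# Sketch of the "closure" notion that testClosure checks, on a toy
# Hasse diagram: cell 0 is covered by edges 1-3, each edge by two
# of the vertices 4-6. Point numbers here are invented for illustration.
cone = {
    0: [1, 2, 3],                      # cell -> edges
    1: [4, 5], 2: [5, 6], 3: [6, 4],   # edges -> vertices
    4: [], 5: [], 6: [],               # vertices cover nothing
}

def closure(point):
    """Transitive closure of the cone relation, in the spirit of
    DMPlexGetTransitiveClosure (without orientations)."""
    seen, stack = set(), [point]
    while stack:
        p = stack.pop()
        if p not in seen:
            seen.add(p)
            stack.extend(cone[p])
    return sorted(seen)

print(closure(0))  # cell, its edges, and their vertices: [0, 1, 2, 3, 4, 5, 6]
```

In the real test the closure is queried on each rank of the distributed mesh, which is where the parallel-only failure appears.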
Comments (8)
-
reporter -
@knepley The issue is still around.
-
Can you give me a stack trace?
-
@knepley here you go:
#0 0x00007ffff6c41eab in raise () from /lib64/libc.so.6
#1 0x00007ffff6c2c5b9 in abort () from /lib64/libc.so.6
#2 0x00007ffff6c2c491 in __assert_fail_base.cold.0 () from /lib64/libc.so.6
#3 0x00007ffff6c3a612 in __assert_fail () from /lib64/libc.so.6
#4 0x00007fffe3ed65df in Mesh<double>::set_boundary (ids=<optimized out>, facets=<optimized out>, nfacets=<optimized out>, this=0x555555952600) at /usr/include/c++/8/bits/predefined_ops.h:49
#5 pragmatic_set_boundary () at /home/devel/petsc/3.10/arch-packages/externalpackages/git.pragmatic/src/cpragmatic.cpp:251
#6 0x00007fffe7929f03 in DMAdaptMetric_Plex (dm=0x555555956090, vertexMetric=0x555556032630, bdLabel=<optimized out>, dmNew=0x7fffb008fe08) at /home/devel/petsc/3.10/src/dm/impls/plex/plexadapt.c:484
#7 0x00007fffe7ab4ebe in DMAdaptMetric (dm=0x555555956090, metric=0x555556032630, bdLabel=0x0, dmAdapt=dmAdapt@entry=0x7fffb008fe08) at /home/devel/petsc/3.10/src/dm/interface/dm.c:6536
#8 0x00007fffe8b1bde7 in __pyx_pf_8petsc4py_5PETSc_2DM_100adaptMetric (__pyx_v_self=0x7fffb008fbf0, __pyx_v_metric=<optimized out>, __pyx_v_label=0x7ffff7f95508) at src/petsc4py.PETSc.c:226414
#9 __pyx_pw_8petsc4py_5PETSc_2DM_101adaptMetric (__pyx_v_self=0x7fffb008fbf0, __pyx_args=<optimized out>, __pyx_kwds=<optimized out>) at src/petsc4py.PETSc.c:29713
This is the failing test:
This is the relevant Cython code:
-
Has there been any progress on this issue? I'm seeing the same error when I try to call plex.coarsen().
python: /home/***/petsc-3.10.1/arch-linux2-c-debug/externalpackages/git.pragmatic/include/Mesh.h:303: void Mesh<real_t>::set_boundary(int, const int*, const int*) [with real_t = double]: Assertion "facet2id.find(facet)!=facet2id.end()" failed. Abort (core dumped)
I'm using PETSc-3.10.1 and petsc4py-3.10.0.
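The assertion that fires lives in Pragmatic's Mesh::set_boundary: each boundary facet handed over by DMAdaptMetric_Plex is looked up in a facet-to-id map, and the abort means a facet was not found in that map. A hedged pure-Python rendering of that lookup (keying facets by their sorted vertex tuple is my assumption for illustration, not Pragmatic's exact code):

```python
# Toy rendering of the failing check in Mesh<double>::set_boundary:
# every incoming boundary facet must already be registered in the
# mesh's facet -> id map, otherwise the C++ assert aborts the run.
# The sorted-vertex-tuple keying is an assumption made here.

def set_boundary(facet2id, facets):
    ids = []
    for facet in facets:
        key = tuple(sorted(facet))
        # Mirrors: assert(facet2id.find(facet) != facet2id.end())
        if key not in facet2id:
            raise AssertionError(f"facet {key} not in facet2id")
        ids.append(facet2id[key])
    return ids

facet2id = {(0, 1): 10, (1, 2): 11}
print(set_boundary(facet2id, [(1, 0), (1, 2)]))  # -> [10, 11]
```

On a distributed mesh, a facet on a partition boundary that the caller labels but the mesh never registered would trip exactly this kind of check, which is consistent with the failure only appearing in parallel runs.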
-
I am bumping the Pragmatic revision to get the refinement that respects boundaries. In the new revision, OpenMP has been removed, so this error should go away.
-
- changed status to resolved
@knepley I need your help here; this failure is with petsc/maint and petsc4py/maint. Maybe the test is outdated. Can you take a quick look?