Tests fail in 64-bit dolfin builds
64-bit dolfin (i.e. with 64-bit indexing) can be built against a 64-bit build of PETSc. This has recently been done for Debian (and Ubuntu).
Some tests fail for the 64-bit build. Test logs can be found at https://ci.debian.net/packages/d/dolfin/unstable/amd64/; look for version 2019.2.0~git20200218.027d9cc-12, e.g. https://ci.debian.net/data/autopkgtest/testing/amd64/d/dolfin/6517526/log.gz
There seem to be three classes of failure.
1) “Out of memory”, e.g. in the C++ unit tests:
Run 64-bit C++ unit tests (serial)
Test project /tmp/autopkgtest-lxc.k8ge4uns/downtmp/build.QiR/src/dolfin-unittests
Start 1: unittests
1/1 Test #1: unittests ........................***Failed 1.50 sec
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
unittests is a Catch v1.9.6 host application.
Run with -? for options
-------------------------------------------------------------------------------
Initialise PETSc
-------------------------------------------------------------------------------
/tmp/autopkgtest-lxc.k8ge4uns/downtmp/build.QiR/src/test/unit/cpp/common/SubSystemsManager.cpp:42
...............................................................................
/tmp/autopkgtest-lxc.k8ge4uns/downtmp/build.QiR/src/test/unit/cpp/common/SubSystemsManager.cpp:44: FAILED:
CHECK_NOTHROW( init_petsc() )
due to unexpected exception with message:
*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
*** fenics-support@googlegroups.com
***
*** Remember to include the error message listed below and, if possible,
*** include a *minimal* running example to reproduce the error.
***
*** -------------------------------------------------------------------------
*** Error: Unable to successfully call PETSc function 'VecSetSizes'.
*** Reason: PETSc error code is: 55 (Out of memory).
*** Where: This error was encountered inside /build/dolfin-HRGZgk/dolfin-2019.2.0~git20200218.027d9cc/dolfin/la/PETScVector.cpp.
*** Process: 0
***
*** DOLFIN version: 2019.2.0.dev0
*** Git changeset: unknown
*** -------------------------------------------------------------------------
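One plausible (unconfirmed) mechanism for an “Out of memory” from VecSetSizes on an otherwise tiny problem is an integer-width mismatch: if any layer still passes a 32-bit index where the 64-bit PETSc expects a 64-bit PetscInt, the high bytes of the size argument are garbage, and PETSc tries (and fails) to honour a nonsense size. A purely illustrative sketch of the effect, not dolfin's actual code path:

```python
import struct

# A caller built for 32-bit indices writes only 4 bytes of the size
# argument; stale memory fills the rest of the 8-byte slot...
intended = 1000
stale = b"\xcd\xcd\xcd\xcd"                   # leftover garbage bytes
arg = struct.pack("<i", intended) + stale     # what ends up in memory

# ...a callee built with 64-bit PetscInt reads all 8 bytes as one integer.
seen = struct.unpack("<q", arg)[0]

print("intended size:", intended)
print("size seen by 64-bit callee:", seen)    # garbage (here negative),
                                              # which PETSc rejects
```

If this is what is happening, the fix would be making sure every index that crosses the dolfin/PETSc boundary is a PetscInt in the 64-bit build.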
2) Unable to call KSPSolve in some demos (others pass), e.g.
Start 13: demo_multimesh-stokes_mpi
7/48 Test #13: demo_multimesh-stokes_mpi .............. Passed 0.40 sec
Start 15: demo_sym-dirichlet-bc_mpi
8/48 Test #15: demo_sym-dirichlet-bc_mpi ..............***Failed 1.75 sec
Process 0: <Table of size 2 x 1>
Process 1: <Table of size 2 x 1>
Process 0: *** Warning: Using PETSc native LU solver. Consider specifying a more efficient LU solver (e.g.umfpack) if available.
Process 2: <Table of size 2 x 1>
Process 1: *** Warning: Using PETSc native LU solver. Consider specifying a more efficient LU solver (e.g.umfpack) if available.
Process 2: *** Warning: Using PETSc native LU solver. Consider specifying a more efficient LU solver (e.g.umfpack) if available.
terminate called after throwing an instance of 'std::runtime_error'
terminate called after throwing an instance of 'std::runtime_error'
terminate called after throwing an instance of 'std::runtime_error'
what(): what():
*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
*** fenics-support@googlegroups.com
***
*** Remember to include the error message listed below and, if possible,
*** include a *minimal* running example to reproduce the error.
***
*** -------------------------------------------------------------------------
*** Error: Unable to successfully call PETSc function 'KSPSolve'.
*** Reason: PETSc error code is: 92 (See https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for possible LU and Cholesky solvers).
*** Where: This error was encountered inside /build/dolfin-HRGZgk/dolfin-2019.2.0~git20200218.027d9cc/dolfin/la/PETScKrylovSolver.cpp.
*** Process: 2
***
*** DOLFIN version: 2019.2.0.dev0
*** Git changeset: unknown
*** -------------------------------------------------------------------------
[ci-215-65927b02:02396] *** Process received signal ***
[ci-215-65927b02:02396] Signal: Aborted (6)
[ci-215-65927b02:02396] Signal code: (-6)
Given the reference to “possible LU and Cholesky solvers” in the KSPSolve error message, I suspect the second class might simply mean that the 64-bit PETSc build lacks the external direct solvers dolfin would otherwise invoke; superlu, for instance, is not available in the 64-bit build. Would this explain the error?
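If that is the cause, the affected demos could query backend availability up front and fall back to PETSc's native LU. A hedged sketch, assuming legacy DOLFIN's `has_lu_solver_method`; the import is guarded so the fallback logic also runs without dolfin installed, and `pick_lu_method` is a hypothetical helper, not part of dolfin:

```python
# Guarded import: use dolfin's availability query when present,
# otherwise a stand-in that only knows about the native PETSc LU.
try:
    from dolfin import has_lu_solver_method
except ImportError:
    def has_lu_solver_method(method):
        return method == "petsc"   # native LU is always compiled in

def pick_lu_method(preferred=("superlu_dist", "superlu", "umfpack")):
    """Return the first available LU backend, falling back to 'petsc'."""
    for method in preferred:
        if has_lu_solver_method(method):
            return method
    return "petsc"

print(pick_lu_method())
```

With a guard like this the demos would at worst be slow (native PETSc LU) rather than abort with PETSc error 92.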
3) MPI_INIT problems, e.g.
18/49 Test #38: demo_curl-curl_serial .....................***Failed 0.02 sec
*** The MPI_Comm_rank() function was called before MPI_INIT was invoked.
*** This is disallowed by the MPI standard.
*** Your MPI job will now abort.
[ci-215-65927b02:02093] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
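For the third class: the MPI standard does permit calling MPI_Initialized before MPI_Init, so the compliant pattern is to query that flag before touching MPI_Comm_rank; the failing demo (or something it pulls in) apparently skips that guard. A sketch of the pattern, illustrative only (mpi4py and the helper names are my own choices, not dolfin's code); the import is guarded so the control flow runs even without an MPI installation:

```python
try:
    from mpi4py import MPI  # note: importing mpi4py.MPI initializes MPI

    def mpi_initialized():
        return MPI.Is_initialized()

    def comm_rank():
        return MPI.COMM_WORLD.Get_rank()
except ImportError:
    def mpi_initialized():
        return False  # no MPI runtime in this environment

    def comm_rank():
        raise RuntimeError("MPI_Comm_rank called before MPI_Init")

def safe_rank():
    """Return this process's rank, or 0 if MPI is not (yet) initialized."""
    return comm_rank() if mpi_initialized() else 0

print(safe_rank())
```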