Initialization error with petsc4py (complex-scalar build)

Issue #68 resolved
Francis Lacombe created an issue

Hi,

I built PETSc with the option "--with-scalar-type=complex"; I have attached a copy of my configure log anyway. The problem arises when I try to initialize PETSc. For example, when I run the following commands

import sys, petsc4py
petsc4py.init(sys.argv)

I get the following error

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/frlacf/.local/lib/python3.5/site-packages/petsc4py/__init__.py", line 42, in init
    PETSc = petsc4py.lib.ImportPETSc(arch)
  File "/home/frlacf/.local/lib/python3.5/site-packages/petsc4py/lib/__init__.py", line 29, in ImportPETSc
    return Import('petsc4py', 'PETSc', path, arch)
  File "/home/frlacf/.local/lib/python3.5/site-packages/petsc4py/lib/__init__.py", line 64, in Import
    module = imp.load_module(fullname, fo, fn, stuff)
  File "/usr/lib/python3.5/imp.py", line 242, in load_module
    return load_dynamic(name, filename, file)
  File "/usr/lib/python3.5/imp.py", line 342, in load_dynamic
    return _load(spec)
  File "<frozen importlib._bootstrap>", line 693, in _load
  File "<frozen importlib._bootstrap>", line 666, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 577, in module_from_spec
  File "<frozen importlib._bootstrap_external>", line 914, in create_module
  File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed
ImportError: libpetsc.so.3.7: cannot open shared object file: No such file or directory
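This kind of ImportError means the dynamic loader could not find libpetsc.so.3.7 in any of its search paths (LD_LIBRARY_PATH, the ldconfig cache, or an rpath baked into the extension). The lookup boils down to something like the following sketch (the find_shared_lib helper is purely illustrative, not petsc4py code):

```python
import os

def find_shared_lib(name, search_dirs):
    """Return the first path containing `name`, roughly mimicking
    how the dynamic loader walks LD_LIBRARY_PATH directories."""
    for d in search_dirs:
        candidate = os.path.join(d, name)
        if os.path.isfile(candidate):
            return candidate
    return None  # -> "cannot open shared object file: No such file or directory"

# Typical diagnostic: list the directories the loader would search.
dirs = [d for d in os.environ.get("LD_LIBRARY_PATH", "").split(os.pathsep) if d]
print(find_shared_lib("libpetsc.so.3.7", dirs))
```

If this prints None, adding $PETSC_DIR/$PETSC_ARCH/lib to LD_LIBRARY_PATH is the usual first thing to try.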

I don't get this error when I build PETSc without the complex-scalar flag... but I need this feature.

If someone could help me with this issue, it would be much appreciated!

Thanks,

Francis

Comments (9)

  1. Francis Lacombe reporter

    I've installed the libpetsc-complex-3.7.5 package

    Now I still get an error, but a different one:

    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/home/frlacf/.local/lib/python3.5/site-packages/petsc4py/__init__.py", line 42, in init
        PETSc = petsc4py.lib.ImportPETSc(arch)
      File "/home/frlacf/.local/lib/python3.5/site-packages/petsc4py/lib/__init__.py", line 29, in ImportPETSc
        return Import('petsc4py', 'PETSc', path, arch)
      File "/home/frlacf/.local/lib/python3.5/site-packages/petsc4py/lib/__init__.py", line 64, in Import
        module = imp.load_module(fullname, fo, fn, stuff)
      File "/usr/lib/python3.5/imp.py", line 242, in load_module
        return load_dynamic(name, filename, file)
      File "/usr/lib/python3.5/imp.py", line 342, in load_dynamic
        return _load(spec)
      File "<frozen importlib._bootstrap>", line 693, in _load
      File "<frozen importlib._bootstrap>", line 666, in _load_unlocked
      File "<frozen importlib._bootstrap>", line 577, in module_from_spec
      File "<frozen importlib._bootstrap_external>", line 914, in create_module
      File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed
    ImportError: /home/frlacf/.local/lib/python3.5/site-packages/petsc4py/lib/arch-linux2-c-debug/PETSc.cpython-35m-x86_64-linux-gnu.so: undefined symbol: TaoLMVMSetH0

  2. Lisandro Dalcin

    I cannot reproduce your issue (it was fixed before petsc4py-3.7.0). Are you sure you are using petsc4py-3.7.0? Any chance you have PETSC_ARCH defined in your environment? If you define PETSC_ARCH, set it to the one corresponding to your build of PETSc with complex scalars.
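For reference, the arch resolution petsc4py performs is roughly the following (a sketch, not the actual implementation; the default shown is just the arch name appearing in the tracebacks above):

```python
import os

def pick_arch(requested=None, default="arch-linux2-c-debug"):
    """Resolve the PETSc arch: an explicit argument wins, then the
    PETSC_ARCH environment variable, then a built-in default."""
    return requested or os.environ.get("PETSC_ARCH") or default

print(pick_arch())
```

This is why a stale PETSC_ARCH in the environment can silently select the wrong build.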

  3. Francis Lacombe reporter

    Yes, I have the right PETSC_ARCH pointing to my complex scalars build.

    Ok, I found out some stuff that might be relevant:

    When I run the command

    ./configure --with-cc=gcc --with-fc=gfortran --with-cxx=g++ --with-clanguage=c --download-fblaslapack --download-openmpi --with-scalar-type=complex

    the behaviour differs depending on how I install petsc4py.

    For example, when I use sudo apt-get install python3-petsc4py, I get the following error:

    [Thor:09893] mca_base_component_repository_open: unable to open mca_patcher_overwrite: /usr/lib/x86_64-linux-gnu/openmpi/lib/openmpi/mca_patcher_overwrite.so: undefined symbol: mca_patcher_base_patch_t_class (ignored)
    [Thor:09893] mca_base_component_repository_open: unable to open mca_shmem_posix: /usr/lib/x86_64-linux-gnu/openmpi/lib/openmpi/mca_shmem_posix.so: undefined symbol: opal_shmem_base_framework (ignored)
    [Thor:09893] mca_base_component_repository_open: unable to open mca_shmem_mmap: /usr/lib/x86_64-linux-gnu/openmpi/lib/openmpi/mca_shmem_mmap.so: undefined symbol: opal_show_help (ignored)
    [Thor:09893] mca_base_component_repository_open: unable to open mca_shmem_sysv: /usr/lib/x86_64-linux-gnu/openmpi/lib/openmpi/mca_shmem_sysv.so: undefined symbol: opal_show_help (ignored)

    It looks like opal_init failed for some reason; your parallel process is likely to abort. There are many reasons that a parallel process can fail during opal_init; some of which are due to configuration or environment problems. This failure appears to be an internal failure; here's some additional information (which may only be relevant to an Open MPI developer):
      opal_shmem_base_select failed
      --> Returned value -1 instead of OPAL_SUCCESS

    It looks like orte_init failed for some reason; your parallel process is likely to abort. There are many reasons that a parallel process can fail during orte_init; some of which are due to configuration or environment problems. This failure appears to be an internal failure; here's some additional information (which may only be relevant to an Open MPI developer):
      opal_init failed
      --> Returned value Error (-1) instead of ORTE_SUCCESS

    It looks like MPI_INIT failed for some reason; your parallel process is likely to abort. There are many reasons that a parallel process can fail during MPI_INIT; some of which are due to configuration or environment problems. This failure appears to be an internal failure; here's some additional information (which may only be relevant to an Open MPI developer):
      ompi_mpi_init: ompi_rte_init failed
      --> Returned "Error" (-1) instead of "Success" (0)

    *** An error occurred in MPI_Init_thread
    *** on a NULL communicator
    *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
    *** and potentially your MPI job)
    [Thor:9893] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!

    And if I use pip3 install petsc4py, I get the following error:

    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/home/frlacf/.local/lib/python3.5/site-packages/petsc4py/__init__.py", line 42, in init
        PETSc = petsc4py.lib.ImportPETSc(arch)
      File "/home/frlacf/.local/lib/python3.5/site-packages/petsc4py/lib/__init__.py", line 29, in ImportPETSc
        return Import('petsc4py', 'PETSc', path, arch)
      File "/home/frlacf/.local/lib/python3.5/site-packages/petsc4py/lib/__init__.py", line 64, in Import
        module = imp.load_module(fullname, fo, fn, stuff)
      File "/usr/lib/python3.5/imp.py", line 242, in load_module
        return load_dynamic(name, filename, file)
      File "/usr/lib/python3.5/imp.py", line 342, in load_dynamic
        return _load(spec)
      File "<frozen importlib._bootstrap>", line 693, in _load
      File "<frozen importlib._bootstrap>", line 666, in _load_unlocked
      File "<frozen importlib._bootstrap>", line 577, in module_from_spec
      File "<frozen importlib._bootstrap_external>", line 914, in create_module
      File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed
    ImportError: /home/frlacf/.local/lib/python3.5/site-packages/petsc4py/lib/arch-linux2-c-debug/PETSc.cpython-35m-x86_64-linux-gnu.so: undefined symbol: TaoLMVMSetH0
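An undefined-symbol failure like this one means the PETSc.cpython extension was compiled against headers that declare TaoLMVMSetH0, but at runtime it loaded a libpetsc that does not export it, i.e. a version or build mismatch. Whether a loaded library exports a given symbol can be probed with ctypes (a diagnostic sketch; has_symbol is a made-up helper, and CDLL(None) assumes a Unix-like system):

```python
import ctypes

def has_symbol(lib, name):
    """Return True if the loaded shared library exports `name`
    (i.e. dlsym() would succeed for it)."""
    try:
        getattr(lib, name)
        return True
    except AttributeError:
        return False

# CDLL(None) exposes symbols already loaded into this process (libc included).
this_process = ctypes.CDLL(None)
print(has_symbol(this_process, "printf"))
```

Running the same check against the libpetsc actually being loaded would show whether it really provides the missing Tao symbol.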

  4. Lisandro Dalcin

    OK, I'm looking at your configure log, and you are doing things in a way that will eventually mess things up. My list of recommendations follows:

    1) Remove all PETSc builds you ever did, including any libpetsc package you installed with apt-get. Remove petsc4py as well, and make sure that running python3 -c "import petsc4py" fails with an ImportError. Remove all MPI packages on your system with apt, that is, anything named openmpi or mpich. In short, make sure you remove all traces of MPI, PETSc, and petsc4py from your system, including stuff beneath your ~/.local directory. Double-check the contents of the environment variable LD_LIBRARY_PATH.

    2) Now install a fresh MPI of your choice with apt. I recommend MPICH. Also install blas/lapack. On a recent Ubuntu this is usually done with apt-get install mpich libmpich-dev libblas-dev liblapack-dev. This way you have the minimal set of packages, with no need to --download MPI or blas/lapack.

    3) Configure PETSc with ./configure --with-scalar-type=complex. Note I'm not using --download options for MPI or blas/lapack; PETSc's configure should find and use the MPI and blas/lapack in your system. Note also I'm not configuring to build with C++; this is not required, and I advise against it. Building with C rather than C++ reduces the chances of messing things up. Then run make all and finally make test to quickly check the build.

    4) Download the petsc4py tarball, unpack it, and cd petsc4py-x.y.z. Then export PETSC_DIR=<dir> PETSC_ARCH=<arch> with the appropriate values, and finally run python3 setup.py install --user.

    5) Now try to run python3 -m petsc4py.help. Does it work?
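Once PETSC_DIR and PETSC_ARCH are exported, it is easy to double-check that the build really uses complex scalars: a --with-scalar-type=complex build defines PETSC_USE_COMPLEX in $PETSC_DIR/$PETSC_ARCH/include/petscconf.h. A small sketch (the helper and the inline sample text are illustrative):

```python
import re

def is_complex_build(petscconf_text):
    """Check petscconf.h-style text for the PETSC_USE_COMPLEX macro,
    which a --with-scalar-type=complex build defines."""
    return re.search(r"#define\s+PETSC_USE_COMPLEX\s+1", petscconf_text) is not None

# In practice you would read $PETSC_DIR/$PETSC_ARCH/include/petscconf.h;
# an inline sample keeps the sketch self-contained.
print(is_complex_build("#define PETSC_USE_COMPLEX 1\n"))
```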

  5. Francis Lacombe reporter

    Thank you very much !

    Now it works.

    I have a question, though. This time I downloaded both PETSc and petsc4py from Bitbucket. Last time I downloaded PETSc from GitHub and petsc4py via pip3, and it didn't work.

    So my question is: which is the most up-to-date and reliable version? Isn't it all supposed to be the same?

    Thanks for your help,

    Francis

  6. Francis Lacombe reporter

    In brief, I completely removed PETSc and petsc4py from my computer and started over. (Thanks to Lisandro for noticing my messed-up build.)

  7. Lisandro Dalcin

    @FLacombe You can use PETSc from github (mirror repo) or bitbucket (official repo). However, if you pip install petsc4py, it will attempt to install the release tarball, and that is only compatible with the 'maint' branch of the petsc repo, so you should first git checkout maint. To use the in-development versions of petsc and petsc4py, you can use the default master branch of petsc, and quickly install a compatible petsc4py with pip install https://bitbucket.org/petsc/petsc4py/get/master.tar.gz, or clone the petsc4py repo and run python setup.py install --user.
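The compatibility rule above (release petsc4py with the maint branch of petsc, development petsc4py with master) in practice comes down to matching major.minor versions. A hypothetical check (compatible is an illustrative helper, not part of either package):

```python
def compatible(petsc_version, petsc4py_version):
    """Simplified rule of thumb: the major.minor pair must match
    between PETSc and petsc4py (patch levels may differ)."""
    return petsc_version[:2] == petsc4py_version[:2]

print(compatible((3, 7, 5), (3, 7, 0)))  # same 3.7 series
print(compatible((3, 8, 0), (3, 7, 0)))  # development petsc vs. a 3.7 release
```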
