PETSc error in VecGetValues() related to boundary conditions

Issue #295 resolved
Mike Welland created an issue

I built a recent dolfin dev version and am encountering a problem with boundary conditions when using multiple cores. I think the problem is that, when the mesh is divided among multiple cores, only some of the partitions have the boundary condition applied to them. Below is an example program.

from dolfin import *

# Unit square with a 10x10 mesh and a P1 Lagrange space
mesh = RectangleMesh(0, 0, 1, 1, 10, 10)
V = FunctionSpace(mesh, "Lagrange", 1)
U = Function(V)
dU = TrialFunction(V)
U_test = TestFunction(V)

# Geometric subdomains for the left and bottom edges
left = CompiledSubDomain("std::abs(x[0]) < DOLFIN_EPS")
bottom = CompiledSubDomain("std::abs(x[1]) < DOLFIN_EPS")

#bc = DirichletBC(V, 0, left)   # fails on 2 or more cores
bc = DirichletBC(V, 0, bottom)  # works on up to 3 cores (see below)
U.interpolate(Constant(0))

# Simple nonlinear form with its Jacobian, solved by the PETSc SNES solver
F = inner(U*grad(U), grad(U_test))*dx
problem = NonlinearVariationalProblem(F, U, bc, derivative(F, U, dU))
solver = NonlinearVariationalSolver(problem)
solver.parameters['nonlinear_solver'] = 'snes'
solver.solve()

So, in 2D: when run in parallel (e.g. with mpirun), my square mesh is partitioned into vertical strips on 2 and 3 cores.

2 cores:

| | |

3 cores:

| | | |

A boundary condition applied to the left side works only on 1 core and fails (message below) on 2 or more, since only the left-most partition makes contact with that boundary. On the other hand, a boundary condition along the bottom works in all three cases, since every partition contacts the bottom. From 4 cores on it fails again, since the partitions are then split vertically too and no longer all touch the bottom.

Oh, and if there is no boundary condition anywhere, everything is hunky-dory.
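A minimal sketch of a per-rank check of this, assuming the DOLFIN 1.3-era Python API used above (in particular that DirichletBC.get_boundary_values() returns the process-local dof map, and that MPI.process_number() gives the rank; both may be named differently in later versions):

from dolfin import *

mesh = RectangleMesh(0, 0, 1, 1, 10, 10)
V = FunctionSpace(mesh, "Lagrange", 1)
left = CompiledSubDomain("std::abs(x[0]) < DOLFIN_EPS")
bc = DirichletBC(V, 0, left)

# Ranks whose partition does not touch the left edge should report 0
n_local = len(bc.get_boundary_values())
print("rank %d sees %d boundary dofs" % (MPI.process_number(), n_local))

Run under MPI on 2 or 3 cores, the nonzero counts should all land on the left-most rank.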

Can anyone reproduce this on their machine? I see the problem on our Linux cluster with the dev version, but not with 1.3 stable.

This is the error message I get:

[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Invalid pointer!
[0]PETSC ERROR: Null Pointer: Parameter # 3!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.4.4, Mar, 13, 2014
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Unknown Name on a arch-linux2-c-debug named blogin3 by mwelland Thu Apr 24 13:15:44 2014
[0]PETSC ERROR: Libraries linked from /fusion/gpfs/home/mwelland/programs/petsc-3.4.4/arch-linux2-c-debug/lib
[0]PETSC ERROR: Configure run at Thu Apr 24 09:25:49 2014
[0]PETSC ERROR: Configure options CFLAGS="-I/soft/mkl/11.0.5.192/mkl/include -m64 -march=native -mavx" CXXFLAGS="-I/soft/mkl/11.0.5.192/mkl/include -m64 -march=native -mavx" FFLAGS="-I/soft/mkl/11.0.5.192/mkl/include -m64 -march=native -mavx" LDFLAGS="-L/soft/mkl/11.0.5.192/mkl/lib/intel64 -lmkl_scalapack_lp64 -lmkl_intel_lp64 -lmkl_core -lmkl_sequential -lmkl_blacs_intelmpi_lp64 -lpthread -lm" --with-shared-libraries --with-mpi-dir=/software/mvapich2-gnu-psm-1.9.5/ --with-boost-dir=/home/mwelland/local_gnu_sequential --with-ptscotch=1 --download-ptscotch=yes --with-mpi4py=1 --with-blas-lapack-dir=/soft/mkl/11.0.5.192/mkl/lib/intel64 --with-numpy=1 --with-scientificpython=1 --with-scientificpython-dir=/home/mwelland/local_gnu_sequential --with-umfpack=1 --download-umfpack=yes --with-valgrind=1 --with-valgrind-dir=/usr/bin/
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: VecGetValues() line 920 in /fusion/gpfs/home/mwelland/programs/petsc-3.4.4/src/vec/vec/interface/rvector.c

Traceback (most recent call last):
  File "test.py", line 20, in <module>
    solver.solve()
RuntimeError:

*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
***     fenics@fenicsproject.org
***
*** Remember to include the error message listed below and, if possible,
*** include a minimal running example to reproduce the error.
***
*** -------------------------------------------------------------------------
*** Error:   Unable to successfully call PETSc function 'VecGetValues'.
*** Reason:  PETSc error code is: 68.
*** Where:   This error was encountered inside /home/mwelland/programs/git/dolfin/dolfin/la/PETScVector.cpp.
*** Process: unknown
***
*** DOLFIN version: 1.3.0+
*** Git changeset:  c458b11cf972bf3033c52e7ff11942dcf905e134
*** -------------------------------------------------------------------------

Thanks, Mike

Comments (14)

  1. Mike Welland reporter

    Hi Jan, I ran the MWE from #263 and it ran without error on 2 cores. Would that MWE produce an error, or just a strange plot? (I don't have plotting enabled on this machine, so I can't easily check.)

  2. Jan Blechta

    I don't think the issue is related to #263. I can reproduce it with one of my DOLFIN builds which at runtime loads a different build of PETSc than it was compiled against. Try:

    $ ldd /path/to/your/libdolfin.so | grep libpetsc

  3. Mike Welland reporter

    It comes out correctly:

    libpetsc.so => /home/mwelland/programs/petsc-3.4.4/arch-linux2-c-debug/lib/libpetsc.so

    I also tried older PETSc versions, without success, and tried compiling the PETSc libraries into the same directory as libdolfin.so.

    Do you see it in a normal run with the FEniCS dev version?

    BTW: I'd be interested in how you change the PETSc version you are calling.

  4. Jan Blechta

    @mwelland The PETSc mismatch announced on our system turned out to be a false alarm. Hence, I don't know where the problem is. I can reproduce it on one slightly older DOLFIN master compiled against a debugging-enabled PETSc, while I cannot reproduce it on the other 4 builds. Is your PETSc also compiled with --with-debugging=1?

  5. Jan Blechta

    @mwelland

    BTW: I'd be interested in how you change the PETSc version you are calling.

    I don't. I thought there was a bug in our builds or modules, but I was wrong.

    As far as I know you can't do that, even in principle: DOLFIN must be dynamically linked at runtime to the PETSc build it was compiled against.
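    A minimal sketch of a runtime check of which libpetsc actually got loaded, assuming Linux (it reads /proc/self/maps) and that importing dolfin pulls in libpetsc:

    import dolfin  # importing dolfin makes the dynamic linker map libpetsc

    # For file-backed mappings the library path is the last field of the line
    with open("/proc/self/maps") as maps:
        petsc_libs = set(line.split()[-1] for line in maps if "libpetsc" in line)
    for path in sorted(petsc_libs):
        print(path)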

  6. Mike Welland reporter

    Yes, PETSc was compiled with --with-debugging=1. I will try it with PETSc debugging off and see if it makes a difference.
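    A quick sketch of a check for that, assuming PETSC_DIR and PETSC_ARCH point at the build in question (debug builds define PETSC_USE_DEBUG in petscconf.h):

    import os

    petsc_dir = os.environ["PETSC_DIR"]            # e.g. ~/programs/petsc-3.4.4
    petsc_arch = os.environ.get("PETSC_ARCH", "")  # e.g. arch-linux2-c-debug
    conf = os.path.join(petsc_dir, petsc_arch, "include", "petscconf.h")
    with open(conf) as f:
        debug = any("PETSC_USE_DEBUG" in line for line in f)
    print("PETSc built with debugging: %s" % debug)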

  7. Mike Welland reporter

    Sorry for the late reply. I recompiled PETSc without debugging, meanwhile updated to the latest dev version, and now it works fine. Thanks Jan.
