Mumps solver segfaults

Issue #453 wontfix
Mikael Mortensen created an issue

Following code segfaults on master. The default LUSolver runs with no problems.

from dolfin import *

# Poisson problem on a unit square
mesh = UnitSquareMesh(100, 100)
V = FunctionSpace(mesh, "CG", 1)
u = TrialFunction(V)
v = TestFunction(V)
bc = DirichletBC(V, 0, "on_boundary")
A, b = assemble_system(inner(grad(u), grad(v))*dx, v*dx, bc)

sol = LUSolver("mumps")
u_ = Function(V)
sol.solve(A, u_.vector(), b)  # <-- segfaults here

Error in `/home/mikael/.hashdist/bld/profile/kqc6o23ygvky/bin/python2.7': double free or corruption (out): 0x0000000002b5c240. Process received signal.

Comments (9)

  1. Prof Garth Wells

    Don't use a buggy solver, i.e. don't use MUMPS.

    You can report a bug to the MUMPS team, but don't hold your breath. Use SuperLU_DIST instead.
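
    A minimal sketch of the suggested workaround: the same problem as in the report, but with the LUSolver method switched from "mumps" to "superlu_dist". This assumes a DOLFIN/PETSc build with SuperLU_DIST enabled; `has_lu_solver_method` guards against builds that lack it.

    ```python
    from dolfin import *

    # Same Poisson setup as in the original report
    mesh = UnitSquareMesh(100, 100)
    V = FunctionSpace(mesh, "CG", 1)
    u = TrialFunction(V)
    v = TestFunction(V)
    bc = DirichletBC(V, 0, "on_boundary")
    A, b = assemble_system(inner(grad(u), grad(v))*dx, v*dx, bc)

    u_ = Function(V)
    # "superlu_dist" is only available if PETSc was built with SuperLU_DIST;
    # fall back to the default LU backend otherwise
    if has_lu_solver_method("superlu_dist"):
        sol = LUSolver("superlu_dist")
    else:
        sol = LUSolver()
    sol.solve(A, u_.vector(), b)
    ```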

  2. Mikael Mortensen reporter

    OK @garth-wells, I didn't know it was buggy. I have had very good experience with it before, though, so I might just have a look.

  3. Prof Garth Wells

    Problems are often related to a MUMPS bug combined with a ParMETIS bug. The MUMPS developers refuse to update to a recent ParMETIS.

  4. Prof Garth Wells

    Letting HashDist install ParMETIS and then using it from PETSc will end poorly. PETSc should install its own dependencies, because PETSc cherry-picks versions that work together and in some cases applies bug fixes.

  5. Jan Blechta

    @garth-wells, I agree about the poor maintenance of MUMPS, but MUMPS is one of only two solvers in DOLFIN with a parallel Cholesky factorization. Ironically, the PETSc wrappers for MUMPS in symmetric mode were buggy until recently.

    The other one is PaStiX, with which I have no experience. Do you think PaStiX could be a good alternative?

  6. Prof Garth Wells

    I have very good experience with PaStiX for elasticity. Its hybrid threaded/MPI mode works very well, using much less memory than pure MPI.

    A problem is that the PETSc PaStiX wrappers weren't very good or well maintained. Also, getting good performance from PaStiX depends on tuning some parameters related to BLAS, and last time I checked these weren't exposed in PETSc.
