Doesn't work with PETSc 3.6.1.0?
Hi, compiling with PETSc 3.6.1.0 gives this error:

PARMETIS ERROR: The sum of tpwgts for constraint #0 is not 1.0

Modules added:

module add cmake
module swap PrgEnv-cray PrgEnv-intel
module add cray-petsc/3.6.1.0 cray-tpsl/1.5.2

Please point me to the problem. Thanks!
Comments (14)
-
-
reporter Here is my configuration:

./configure CC=cc CXX=CC \
  --prefix=/cfs/klemming/nobackup/v/vdnguyen/DD2365/fenics-hpc.dist-minimal-0.1alpha02/local \
  --with-pic --disable-boost-tr1 \
  --with-petsc --with-petsc=/opt/cray/petsc/3.6.1.0/real/INTEL/140/haswell \
  --enable-mpi --enable-mpi-io --disable-progress-bar \
  --with-xml --disable-xmltest --with-gts \
  --host=x86_64-unknown-linux-gnu --with-parmetis \
  --with-xml-prefix=/cfs/klemming/nobackup/v/vdnguyen/DD2365/fenics-hpc.dist-minimal-0.1alpha02/local \
  --with-xml-exec-prefix=/cfs/klemming/nobackup/v/vdnguyen/DD2365/fenics-hpc.dist-minimal-0.1alpha02/local \
  --enable-static --enable-ufl
-
No no no, one should never pass explicit paths on a Cray; all linking flags are provided by the loaded modules (and detected by configure).
Also, is this an XML mesh or a binary one, and which revision of dolfin?
Furthermore, this is not an issue with PETSc if the problem lies in ParMETIS.
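Following that advice, a configure invocation on a Cray can be much shorter, since the cc/CC compiler wrappers already carry the include and link flags for every loaded module. A hypothetical sketch (the prefix is a placeholder, and the retained flags are only those from the reporter's own command that do not hard-code paths):

```shell
# Load the environment; the wrappers pick up PETSc/ParMETIS from here.
module swap PrgEnv-cray PrgEnv-intel
module add cmake cray-petsc/3.6.1.0 cray-tpsl/1.5.2

# No --with-petsc=/opt/... paths: configure detects them via cc/CC.
./configure CC=cc CXX=CC --prefix=$HOME/local \
  --with-petsc --with-parmetis --enable-mpi
```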
-
reporter Hi, I cloned dolfin-hpc from your branch. This is for binary files. What is the correct way of doing it?
-
I assume that you have built dolfin correctly. Is the problem PE-dependent? How many vertices do you have per PE?
ParMETIS can, under certain conditions, return empty partitions! The remedy is to fiddle with the parameters (MetisInterface.cpp).
-
reporter Hi, it is running on 64 nodes.
Mesh size:
Global number of vertices: 484489
Global number of cells: 2700747
-
Ok, that should be fine. Did you try with a properly built dolfin? And did you try running on different numbers of PEs?
-
reporter I searched, removed dolfin.pc, and restarted the installation many times, but I got the same error. I used the same procedure to install with PETSc 3.5.2.1 and it worked. However, I would like to use PETSc 3.6.1 for its better iterative solvers...
-
Again, this is not a PETSc issue but rather a ParMETIS one (provided by the same module).
Can you share the mesh so I can debug?
-
reporter Great! You can clone the 'unicorn-turbine' branch of unicorn-hpc. There you can find main.cpp and the meshes in sim00. Please let me know if you experience any problems. It should work with the 'movingmesh' branch of dolfin-hpc; I have tested it with PETSc 3.5. Otherwise, you need to update some functions in the LinearPDE and DirichletBC classes. Thanks!
-
There you go, you're using an outdated branch...
-
reporter Unfortunately, that is not the case. I cloned your main branch and updated some functions following the 'movingmesh' branch. Sorry, I did not push the updated version of that branch.
-
reporter Hi, the problem is now solved with a clean installation. Thanks!
-
reporter - changed status to resolved
How do you configure dolfin?