Compute several desired eigenvalues of a generalized eigenvalue problem

Issue #32 new
Wu created an issue

Hello, I have a generalized eigenvalue problem Aq = wBq. The sparse matrices A and B are shown below; neither is symmetric.

matrix A: [attachment: Screenshot from 2018-04-04 10-34-57.png]

matrix B: [attachment: Screenshot from 2018-04-04 10-35-43.png]

My question is: how can I compute only the few eigenvalues I am interested in? I used the following code, where sigma is the desired eigenvalue, and I want to compute, for example, 15 eigenvalues around sigma. As written, the code seems to compute the eigenvalues with the largest imaginary part of the overall spectrum.

import sys, slepc4py
slepc4py.init(sys.argv)
from petsc4py import PETSc    
from slepc4py import SLEPc

opts = PETSc.Options()
kk = 15

# LHS and RHS are assumed to be scipy.sparse CSR matrices holding A and B;
# create PETSc sparse matrices (createAIJ) and assign the lhs and rhs data
A = PETSc.Mat().createAIJ(size=LHS.shape,
                          csr=(LHS.indptr, LHS.indices, LHS.data),
                          comm=PETSc.COMM_WORLD)
A.assemble()
B = PETSc.Mat().createAIJ(size=RHS.shape,
                          csr=(RHS.indptr, RHS.indices, RHS.data),
                          comm=PETSc.COMM_WORLD)
B.assemble()

# create SLEPc Eigenvalue solver
E = SLEPc.EPS().create(PETSc.COMM_WORLD)
E.setOperators(A, B)
E.setDimensions(kk, PETSc.DECIDE)    # set number of eigenvalues to compute
E.setWhichEigenpairs(E.Which.LARGEST_IMAGINARY)   # select eigenvalues with largest imaginary part
E.setProblemType(SLEPc.EPS.ProblemType.GNHEP)     # generalized non-Hermitian eigenvalue problem

# tol, sigma and v0 are assumed to be defined earlier in the script
E.setTolerances(tol, 500)  # convergence tolerance and max number of iterations
if sigma is not None:
    E.setTarget(sigma)     # set the desired eigenvalue (the target)
if v0 is not None:
    E.setInitialSpace(v0)  # set the initial vector

st = E.getST()
st.setType('sinvert')      # shift-and-invert spectral transformation

E.setFromOptions()
E.solve()
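
Note: setWhichEigenpairs(E.Which.LARGEST_IMAGINARY) explicitly asks for the eigenvalues with the largest imaginary part, which matches the behavior observed above. To compute eigenvalues closest to the target, SLEPc selects by TARGET_MAGNITUDE; a minimal sketch of the likely fix, keeping the shift-and-invert transformation already set up:

E.setWhichEigenpairs(SLEPc.EPS.Which.TARGET_MAGNITUDE)  # eigenvalues closest to the target
E.setTarget(sigma)                                      # center of the region of interest
E.getST().setType(SLEPc.ST.Type.SINVERT)                # shift-and-invert around sigma
E.setFromOptions()
E.solve()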

Comments (13)

  1. Wu reporter

    Hi Jose, thanks for your quick reply. I tried your suggestion, and it works. The computed eigenvalues are correct compared with scipy.eigs. But another problem is that the computing time is far longer than with scipy.eigs.

    slepc: 1585.7s

    scipy.eigs: 120.9s

    Do you have any suggestions to improve the speed? Thanks

  2. Jose E. Roman

    Run with -log_view to see the time breakdown of the different steps. Most probably the most time-consuming part is the factorization. This can be improved by installing PETSc with an external package for the factorization. For instance, configure PETSc with --download-suitesparse and run with the option -st_pc_factor_mat_solver_package umfpack. Another option is MUMPS. See section 3.4.1 of the SLEPc users manual.

    Note: -st_pc_factor_mat_solver_package is renamed to -st_pc_factor_mat_solver_type in PETSc/SLEPc 3.9
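
    Since the script calls slepc4py.init(sys.argv), these options can also be passed directly on the command line, for example (the script name solve.py is just a placeholder):

    python solve.py -st_pc_factor_mat_solver_package umfpack -log_view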

  3. Wu reporter

    Is it possible to set

    -st_pc_factor_mat_solver_package umfpack 
    

    inside the Python code?
    The reason I changed from scipy.eigs to SLEPc is that scipy.eigs uses a direct SuperLU factorization, which always results in memory errors when my eigenvalue problem is large. So does this external PETSc package also use a direct factorization? If so, I think it would also end up with a memory error.

  4. Jose E. Roman

    You can do this: opts.setValue('-st_pc_factor_mat_solver_package','umfpack').

    Yes, you will probably run out of memory. But with slepc4py you can run in parallel on a small cluster. Use MUMPS in this case.
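
    In context, that looks like the following sketch (the option just needs to be in the options database before E.setFromOptions() runs):

    opts = PETSc.Options()
    opts.setValue('st_pc_factor_mat_solver_package', 'umfpack')  # or 'mumps'
    # ... set up the EPS solver as before ...
    E.setFromOptions()   # picks up the option and configures the factorization
    E.solve()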

  5. Wu reporter

    Hello Jose, thanks again for your instructions. I succeeded with MUMPS, and the computing time is now comparable to scipy.eigs. But when I try a parallel run with mpirun, I get an error like:

    WARNING! There are options you set that were not used!
    WARNING! could be spelling mistake, etc!
    Option left: name:-st_pc_factor_mat_solver_package value: mumps
    

    Does this mean MUMPS does not support parallel runs?

  6. Jose E. Roman

    It should work in parallel. Add -eps_view to see if MUMPS is being picked for the factorization.

  7. Wu reporter

    Hi Jose, over the past days I have tested SLEPc with single and multiple processors, and a strange thing happens. When I run my code in single-process mode, the solver utilizes 4 processors in the background, which is efficient and fast, as shown below. This is even faster than running with mpirun (with more processors!)

    [attachment: plot_slepc_mumps_L0.png]

    But after I recompiled either mpich or petsc (sorry, I don't know which is the exact cause), running slepc4py ends up using only one processor. This makes the computation very slow.

    Do you have any idea what could be the reason for this?

  8. Jose E. Roman

    SLEPc can use threads if PETSc has been configured with a multi-threaded BLAS/LAPACK.

    To run with MPI make sure you use the mpiexec corresponding to the same MPI implementation that was used for configuring PETSc.
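
    For example (the path is illustrative; the point is to invoke the mpiexec of the MPI implementation PETSc was built with):

    /path/to/same-mpi/bin/mpiexec -n 4 python solve.py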

  9. Wu reporter

    Thanks Jose, this helps: I re-compiled my PETSc, and it now runs with multi-threading. Another problem I have is that I will use SLEPc to solve my eigenvalue problem, simply replacing the original scipy.eigs code. In my original code I used mpi4py to run my eigenvalue-problem jobs on multiple processors, which means I have to keep using mpi4py. But when I combine the SLEPc code with mpi4py, the process gets stuck without computing. Do you have any hint for this?

  10. Wu reporter

    Hi Jose, I tested solving the eigenvalue problem separately, without mpi4py. The code indeed works just as the examples do.

    But my case is a bit different: I have to incorporate the SLEPc solver into my own code, where mpi4py is used for multi-processing. As soon as I introduce mpi4py into the above-mentioned test code and run the eigenvalue problem on a certain processor, the code always gets stuck, without reporting any error or doing anything.

    So I would like to ask: how should I deal with this in my case?

  11. Lisandro Dalcin

    Are you sure you built mpi4py with the same MPI used for PETSc/SLEPc and petsc4py/slepc4py?

    While petsc4py/slepc4py do not require mpi4py, the three packages can be used together. Either you have a bug in your code, or there is some issue with the backend MPI, or some misunderstanding.

    At this point, you should send us a minimal self-contained example that reproduces your issue, so I can take a look on my side and figure out what's going on.
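
    For reference, a minimal self-contained combination of the three packages could look like the sketch below (assumptions: mpi4py is imported first so all packages share the same MPI, slepc4py.init runs before importing SLEPc, and the matrix is a toy diagonal example):

    from mpi4py import MPI              # initializes MPI on import
    import sys, slepc4py
    slepc4py.init(sys.argv)             # must run before importing SLEPc
    from petsc4py import PETSc
    from slepc4py import SLEPc

    comm = MPI.COMM_WORLD
    print("rank %d of %d" % (comm.rank, comm.size))

    # toy standard eigenproblem: diagonal matrix with entries 1..n
    n = 100
    A = PETSc.Mat().createAIJ([n, n], comm=PETSc.COMM_WORLD)
    A.setUp()
    rstart, rend = A.getOwnershipRange()
    for i in range(rstart, rend):
        A[i, i] = i + 1.0
    A.assemble()

    E = SLEPc.EPS().create(PETSc.COMM_WORLD)
    E.setOperators(A)
    E.setFromOptions()
    E.solve()
    if comm.rank == 0:
        print("converged eigenpairs:", E.getConverged())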
