blopex / PETScInstallDevel

Introduction

BLOPEX has been moved from PETSc to SLEPc.

It is easiest to test the development versions if the source control package Git is installed on the test system. In my instructions below, I assume that it is.

Installing the development versions of PETSc (now without BLOPEX)

Follow http://www.mcs.anl.gov/petsc/developers/index.html

One must set up PETSC_DIR, e.g., in my case

export PETSC_DIR=/home/aknyazev/work/petsc/petsc

Once Git is installed, obtain petsc with the following:

git clone https://bitbucket.org/petsc/petsc

This needs to be done only once. After this initial setup, all changes in petsc source can be obtained by

cd $PETSC_DIR
git pull
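
To confirm which commit the working copy ends up at after the pull, standard Git commands can be used, e.g.:

git log -1 --oneline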

During the PETSc installation, you must set up both the PETSC_DIR and PETSC_ARCH environment variables. PETSC_ARCH makes it possible to compile PETSc with different configure options using a single source directory. PETSC_DIR and PETSC_ARCH are also needed in the next step, the SLEPc installation.

To double-check your environment variables, run

env | grep PETSC
env | grep SLEPC

To test different installation options using a single petsc source directory, open multiple terminal windows and, in each individual window, run one of the following:

cd $PETSC_DIR 
export PETSC_ARCH=arch-linux2-c-debug-real-int32-float64
./configure --download-hypre
(hypre is not available with 64-bit indices or complex scalars) or
cd $PETSC_DIR
export PETSC_ARCH=arch-linux2-c-debug-real-int64-float64
./configure --with-64-bit-indices
or
cd $PETSC_DIR 
export PETSC_ARCH=arch-linux2-c-debug-complex-int32-float64
./configure --with-scalar-type=complex
or
cd $PETSC_DIR
export PETSC_ARCH=arch-linux2-c-debug-complex-int64-float64
./configure --with-scalar-type=complex --with-64-bit-indices

Note that there is also a PETSc option --with-precision=single, which is supported by SLEPc, but not yet by BLOPEX.

For configure, add the other usual options that need to be tested. DO NOT attempt to install BLOPEX within PETSc (BLOPEX has already been removed from the latest PETSc version). After configure, compile as usual.

It is NOT safe to run PETSc configure and make for different PETSC_ARCH values in separate terminal windows at the same time! Do it one by one, for instance with a script like the sketch below.
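
If all four variants above need to be (re)built, a shell script can run them strictly one after another. The following is just an illustrative sketch; the arch names and configure options mirror those listed above:

#!/bin/bash
# Configure, build, and test several PETSC_ARCH variants sequentially.
cd $PETSC_DIR
build () {
  export PETSC_ARCH=$1; shift
  ./configure "$@" && make && make test
}
build arch-linux2-c-debug-real-int32-float64 --download-hypre
build arch-linux2-c-debug-real-int64-float64 --with-64-bit-indices
build arch-linux2-c-debug-complex-int32-float64 --with-scalar-type=complex
build arch-linux2-c-debug-complex-int64-float64 --with-scalar-type=complex --with-64-bit-indices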

Installing the development versions of SLEPc with BLOPEX

According to http://slepc.upv.es/download/download.htm, run

git clone https://bitbucket.org/slepc/slepc

e.g., in the same directory where the newly created petsc directory is located. At a later time, the repository can be refreshed, from within the slepc directory, simply by:

git pull

As suggested at http://slepc.upv.es/documentation/instal.htm, execute

cd slepc
export SLEPC_DIR=$PWD

In my case, I simply use

export SLEPC_DIR=/home/aknyazev/work/petsc/slepc

Then configure and make SLEPc with BLOPEX:

cd $SLEPC_DIR
./configure --download-blopex 
make
make test

It is NOT safe to run SLEPc configure and make for different PETSC_ARCH values in separate terminal windows at the same time! Do it one by one.

The SLEPc configure reads PETSc configure parameters using PETSC_DIR, so PETSc configure must be executed first, as in the previous step.
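
As a quick sanity check before configuring SLEPc, one can verify that the PETSc configuration for the current PETSC_ARCH is in place; assuming the standard PETSc directory layout, the generated header petscconf.h should exist under the arch directory:

ls $PETSC_DIR/$PETSC_ARCH/include/petscconf.h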

SLEPc's make test detects from the SLEPc configure whether --download-blopex has been used and, if so, runs the BLOPEX tests. If everything works correctly, make test should produce no errors.
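
Since make test produces a fair amount of output, it may be convenient to capture it in a log file (the file name here is arbitrary) and scan the log for problems, e.g.:

cd $SLEPC_DIR
make test 2>&1 | tee make_test.log
grep -i error make_test.log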

Installation and compilation shell script example

#!/bin/bash
# Pull the latest PETSc and SLEPc sources, then rebuild both for a single
# PETSC_ARCH, with hypre in PETSc and BLOPEX in SLEPc.
export PETSC_DIR=/home/aknyazev/work/petsc/petsc
export SLEPC_DIR=/home/aknyazev/work/petsc/slepc
cd $PETSC_DIR
git pull
cd $SLEPC_DIR
git pull
# PETSc must be configured and built before SLEPc.
cd $PETSC_DIR
export PETSC_ARCH=arch-linux2-c-debug-real-int32-float64
./configure --download-hypre
make
make test
cd $SLEPC_DIR
./configure --download-blopex
make
make test

Running SLEPc examples with BLOPEX

BLOPEX-compatible SLEPc examples are listed at http://slepc.upv.es/documentation/current/src/eps/examples/tutorials/index.html and are located in

cd $SLEPC_DIR/src/eps/examples/tutorials

BLOPEX-compatible SLEPc examples are 1-4, 7, 11, and 19. To use BLOPEX in any BLOPEX-compatible SLEPc example, simply add "-eps_type blopex" to the command line. In order to use hypre preconditioning, also add "-st_pc_type hypre -st_pc_hypre_type boomeramg", assuming that PETSc has been configured with hypre.
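
For instance, for ex1 (a standard symmetric eigenproblem), a BLOPEX run might look like the following; the -n option (the matrix dimension in ex1) and its value 100 are just an illustration:

make ex1
./ex1 -n 100 -eps_type blopex
./ex1 -n 100 -eps_type blopex -st_pc_type hypre -st_pc_hypre_type boomeramg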

ex4 - reading a PETSc matrix from a file

ex7 - generalized, reading PETSc matrices from files

./ex7 -help

shows the example-specific command-line options:

Solves a generalized eigensystem Ax=kBx with matrices loaded from a file.
This example works for both real and complex numbers.

The command line options are:
  -f1 <filename>, where <filename> = matrix (A) file in PETSc binary form.
  -f2 <filename>, where <filename> = matrix (B) file in PETSc binary form.
  -ninitial <nini>, number of user-provided initial guesses.
  -finitial <filename>, where <filename> contains <nini> vectors (binary).
  -nconstr <ncon>, number of user-provided constraints.
  -fconstr <filename>, where <filename> contains <ncon> vectors (binary).

In order to shift the pencil {A,B} into {A+alpha B,B} use the -st_shift option in SLEPc, for instance (to set alpha=1.0):

./ex7 -f1 fileA -f2 fileB -eps_gen_hermitian -eps_type blopex -st_shift 1.0

A comprehensive collection of test matrices is at $PETSC_DIR/share/petsc/datafiles/matrices/
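
For example, the spd-real-int32-float64 matrix used in the script at the end of this page can be fed to ex7 directly; this particular file matches a PETSc build with real scalars and 32-bit indices:

./ex7 -f1 $PETSC_DIR/share/petsc/datafiles/matrices/spd-real-int32-float64 -eps_gen_hermitian -eps_type blopex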

ex19 - 3D Laplacian

The option -st_ksp_type preonly is the default in the following:

./ex19 -da_grid_x 100 -da_grid_y 100 -da_grid_z 100 -eps_type blopex -eps_nev 2 -eps_tol 1e-11 -eps_max_it 10 -st_pc_type hypre -st_pc_hypre_type boomeramg -showtimes -eps_view

For large problems, the code may run out of memory and crash. In this case, add the following extra options:

-st_pc_hypre_boomeramg_agg_nl 1 -st_pc_hypre_boomeramg_P_max 4 -st_pc_hypre_boomeramg_interp_type standard 

so that you run

./ex19 -eps_type blopex -st_pc_type hypre -st_pc_hypre_type boomeramg -st_pc_hypre_boomeramg_agg_nl 1 -st_pc_hypre_boomeramg_P_max 4 -st_pc_hypre_boomeramg_interp_type standard -showtimes -eps_view

In the following example, the preconditioner for the BLOPEX eigensolver is the PCG iterative method, preconditioned with hypre's boomeramg:

./ex19 -eps_type blopex -st_ksp_type cg -st_pc_type hypre -st_pc_hypre_type boomeramg -st_pc_hypre_boomeramg_agg_nl 1 -st_pc_hypre_boomeramg_P_max 4 -st_pc_hypre_boomeramg_interp_type standard -showtimes -eps_view

For a high-quality preconditioner, such as boomeramg in this case, using PCG on top of the preconditioner is overkill - the increase in complexity needed to set up the PCG solver does not lead to enough convergence acceleration. The situation is reversed if the preconditioner is cheap and poor. The direct application of Jacobi below makes lobpcg converge in ~350 iterations:

./ex19 -da_grid_x 100 -da_grid_y 100 -da_grid_z 100 -eps_type blopex -eps_max_it 1000 -st_pc_type jacobi -showtimes -eps_view

while the use of Jacobi-preconditioned PCG with 7 inner iterations makes lobpcg converge in ~50 iterations:

./ex19 -da_grid_x 100 -da_grid_y 100 -da_grid_z 100 -eps_type blopex -st_ksp_type cg -st_pc_type jacobi -st_ksp_max_it 7 -showtimes -eps_view

The total number of inner-outer iterations is about the same, ~350, in both runs, but the latter run is almost 3 times faster in CPU time, since a single PCG iteration is cheaper than a single LOBPCG iteration.
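
To reproduce such a CPU-time comparison, the two runs above can simply be timed with the shell's time builtin:

time ./ex19 -da_grid_x 100 -da_grid_y 100 -da_grid_z 100 -eps_type blopex -eps_max_it 1000 -st_pc_type jacobi
time ./ex19 -da_grid_x 100 -da_grid_y 100 -da_grid_z 100 -eps_type blopex -st_ksp_type cg -st_pc_type jacobi -st_ksp_max_it 7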

The usual PETSc command-line options are also supported, e.g., -log_summary.

You can compile and run multiple examples and redirect the screen output to a text file by running a script like the following:

#!/bin/bash
# Compile ex7 and ex19, keep the binaries in a per-PETSC_ARCH subdirectory,
# and collect the output of both runs in output.txt.
cd $SLEPC_DIR/src/eps/examples/tutorials
mkdir -p $PETSC_ARCH
make ex7
mv ex7 $PETSC_ARCH
make ex19
mv ex19 $PETSC_ARCH
cd $PETSC_ARCH
./ex7 -f1 $PETSC_DIR/share/petsc/datafiles/matrices/spd-real-int32-float64 -eps_gen_hermitian -eps_type blopex -st_pc_type hypre -st_pc_hypre_type boomeramg > output.txt
./ex19 -eps_type blopex -st_pc_type hypre -st_pc_hypre_type boomeramg >> output.txt
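
After the script finishes, output.txt can be scanned for problems, e.g.:

grep -i error $SLEPC_DIR/src/eps/examples/tutorials/$PETSC_ARCH/output.txt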
