Matt Knepley  committed 67c4136

Personal Review for 2013

  • Parent commits 6200f64

File credentials/PersonalReview13.tex

+  AagaardKnepleyWilliams13,
+  KreienkampEtAl2013,
+  BruneKnepleyScott2013,
+  ZhengEtAl13b,
+  KnepleyTerrel2013,
+  msk2013,
+  KarpeevKnepleyBrune13,
+  ZhengEtAl13,
+  KnepleyYuen13,
+PETSc Tutorials:
+  PETSc Tutorial, Imperial College, London, UK, March 2014
+  PETSc Tutorial, Minnesota Supercomputing Institute, University of Minnesota, Minneapolis, MN, September 2013
+  Crustal Deformation Modeling Tutorial Week, June 2013
+  Advanced PETSc Tutorial, Maison de la Simulation, Orsay, France, June 2013
+Invited Talks:
+  Nonlinear Preconditioning in PETSc, Oxford University, Oxford, UK, March 2014
+  Nonlinear Preconditioning in PETSc, Imperial College, London, UK, March 2014
+  Runtime Configurability in PETSc, SIAM PP, Portland, OR, February 2014
+  Algorithms for Exascale Computational Mesoscience, ExaMath13 Workshop, Washington, DC, August 2013
+  Finite Element Integration using CUDA and OpenCL, GPU-SMP 13, Changchun, China, July 2013
+  The Process of Computational Science, Maison de la Simulation, Orsay, France, June 2013
+  Nested and Hierarchical Solvers in PETSc, SIAM CS\&E, Boston, MA, February 2013
+  APAM Colloquium, Columbia University, New York, NY, February 2013
+Meetings Organized:
+  I co-organized (with Margarete Jadamec) the CIG \emph{Implementing Solvers in CitcomCU and CitcomS Workshop},
+    September 16-17, 2013 in Davis, CA.
+  I co-organized (with Dave Yuen, Jed Brown, and others) the \emph{International Workshop of GPU Solutions to
+    Multiscale Problems in Science and Engineering}, July 29-August 2, 2013 in Changchun, Jilin, China.
+  We have recently released PyLith 2.0, the premier package for large-scale
+parallel simulation of crustal deformation, both dynamic and quasi-static. The new release fully supports the
+PETSc DMPlex interface for parallel unstructured meshes, including support for parallel refinement, prismatic cohesive
+cells, and submesh selection.
+  Support for scalable Stokes preconditioning, even in the face of large viscosity variations, is now solid. The pTatin3d
+code from Prof. Dave May at ETH Zurich, based upon PETSc, has demonstrated mesh independent convergence up
+to 10K processors and hundreds of millions of unknowns. This same technology is now being applied to the much more
+complex magma dynamics problem. Early results in TerraFERMA from Prof. Marc Spiegelman at Columbia, also based
+upon PETSc, show mesh independent convergence for sequential systems.
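Solvers of this kind are assembled entirely at runtime from the PETSc options database. A hypothetical sketch of the options driving a Schur-complement fieldsplit preconditioner for Stokes (exact option names and block choices vary across PETSc versions and applications):

```
# Hypothetical options sketch: Schur-complement fieldsplit for Stokes
# (field 0 = velocity, field 1 = pressure)
-ksp_type fgmres
-pc_type fieldsplit
-pc_fieldsplit_type schur
-pc_fieldsplit_schur_fact_type upper
-fieldsplit_0_ksp_type preonly
-fieldsplit_0_pc_type mg        # multigrid on the viscous (velocity) block
-fieldsplit_1_ksp_type gmres
-fieldsplit_1_pc_type jacobi    # simple preconditioner for the Schur complement
```

Because the entire stack is options-driven, the same application code can switch between full, upper, and diagonal Schur factorizations, or swap the inner preconditioners, without recompilation.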
+  Nonlinear preconditioning really works (Barry to supply verbiage).
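As with the linear solvers, nonlinear preconditioning is configured from the options database. A hypothetical sketch, composing nonlinear GMRES with a single Newton step as the inner (nonlinear) preconditioner via the `-npc_` prefix:

```
# Hypothetical options sketch: NGMRES accelerated by an inner Newton solve
-snes_type ngmres
-npc_snes_type newtonls
-npc_snes_max_it 1
```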
+  DMPlex is now a fully functional PETSc component, which supports most popular mesh formats (ExodusII, CGNS,
+Gmsh, Triangle) and generators (Triangle, TetGen, Cubit). It has been used in a range of problems, including modeling
+the earthquake cycle, first principles fracture initiation and propagation, power systems modeling, and atmospheric
+climate modeling. The Firedrake project at ICL has selected DMPlex as the basis of its FEM implementation, as has the
+GRINS project from ICES.
+  We have run small scale experiments using parallel, unstructured multigrid for FEM systems. For example, SNES ex12
+shows the p-Laplacian problem without regularization (using SVD on the coarse level). These will be extended, in
+collaboration with the MOOSE team at INL, the libMesh team at ICES, UT Austin, and Paul Baumann at SUNY Buffalo. We
+expect scalable results by the December review.
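A run of this kind is again driven from the command line; a hypothetical invocation of SNES ex12 with geometric multigrid and an SVD coarse solve (option names for the example itself may differ between PETSc releases):

```
# Hypothetical command line: unstructured multigrid for the p-Laplacian in ex12
./ex12 -run_type full -refinement_limit 0.0125 \
       -pc_type mg -pc_mg_levels 3 -ksp_rtol 1e-9 \
       -mg_coarse_pc_type svd   # SVD handles the (possibly singular) coarse problem
```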
+  Support for FEM abstractions, including the flexible definition of complex elements, has been added to PETSc. This allows
+the user to define an element, which together with an imported or generated mesh can be used to automatically construct
+a parallel data layout (Vec and Mat), refined spaces, interpolation and restriction operators, and multilevel solvers.
+  Preliminary support for FVM has been added. TS ex11 shows a second order TVD FVM with limiters, reconstruction, and
+flexible physics routines. In both our FEM and FVM examples, the user specifies both physics and boundary conditions only
+as pointwise functions, eliminating the unnecessary dependence on details of the mesh, discretization, or solver. This is
+an important step towards an independent library of physical problems.