This is a stand-alone version of yt's Parallel HOP that reads GDF
(http://yt.enzotools.org/wiki/GridDataFormat) files.

Stephen Skory
email@example.com

--- Requirements ---

- Python > 2.6 (has not been tested with 3+)
- Numpy > 1.4
  http://numpy.scipy.org/
- mpi4py (1.1.0, 1.2.1 work)
  http://code.google.com/p/mpi4py/
- Forthon
  http://hifweb.lbl.gov/Forthon/

--- Installation ---

Follow the included instructions for each package, beginning with Python,
then Numpy, and then mpi4py and Forthon. It may be worth investigating the
Enthought Python Distribution (http://www.enthought.com/products/epd.php),
which comes with Numpy and many other packages already built in.

Of the four, mpi4py is probably the most difficult to build. Please see the
mpi4py user group linked off the page above for assistance, or for easy
questions, the yt users list. On some systems, it may be necessary to
install the `python2.6-mpi' executable, and call that when running parallel
Python scripts.

Once all the packages are installed, the kD-tree shared object needs to be
built by invoking the `make' command in the `kdtree' directory. If
everything works, there should be a file named `fKDpy.so' in that directory
when the compilation finishes. Nothing more needs to be done!

--- Included Files ---

main.py:
  This file contains the settings that need to be changed when running
  Parallel HOP. This is also the file that gets called:
  `mpirun -n 8 python ./main.py'.

parallel_hop.py:
  This is the main source of Parallel HOP. It is nearly identical to the
  version in yt, minus various yt-specific performance timing hooks and
  print statements. The functions in here are called by main.py.

mpi_tools.py:
  This is the subset of the parallel tools written for yt that Parallel
  HOP needs.

kdtree:
  This directory contains the Fortran kD-tree implementation used by
  Parallel HOP.

--- Using Parallel HOP ---

1.
The top of the file `main.py' has some configurations that control the
input files, how Parallel HOP is run, and the output files.

2. Run Parallel HOP with a command like this:
   `mpirun -n 8 python ./main.py'.

3. The output may take several forms:

   - `halos.out' - A text file giving the index, number of particles,
     center of mass, maximum density, maximum density location, total
     mass, and maximum radius for each halo.

   - zregroup files - A set of binary files, one per MPI task, that give
     the halo tag and HOP density for each particle. Default: Off.

   - h5 files - Similar to the above, but in HDF5 format. Default: On.

--- Future Improvements ---

- Calculate halo bulk velocities and velocity dispersions.
- Bug fixing. Let the author know of any bugs!

=== Changelog === (newest at top)

8 Sept 2010 - Initial version & repo commit.