Sunrise / HIPACCProjects

This page describes the Sunrise student projects for the University of California HIPACC (High-Performance AstroComputing Center) 2010 Summer School. (They might also be useful for others who want an introduction to using Sunrise.)

Because of the lecture schedule, there is very little time for the Sunrise projects, so they are all quite limited in scope. The goal is to let you learn the basics of how to use Sunrise, what its capabilities are, and how to interpret the outputs. You can choose as many of these as you want, though I encourage you to do at least one of projects 1 and 2 before progressing to the later ones, as this will give you an opportunity to concentrate on the radiation transfer aspects before you have to deal with the hydro code outputs.

Projects

  1. A spherical cloud with a central point source. Use Sunrise to calculate the emerging spectral energy distribution of an optically thick spherical cloud illuminated by a central point source, and explore how this depends on the properties of the cloud.
  2. An externally illuminated cloud. This is similar to the above project but deals with a dark cloud illuminated by an external radiation field.
  3. Images and spectra of a galaxy simulated with GADGET. In this project you will take the output from a GADGET simulation and process it with Sunrise. If you have run GADGET previously in the school, you can use your own output, otherwise there is one provided for you.
  4. Images and spectra of a galaxy simulated with GASOLINE. This is the same as the previous project but uses GASOLINE.

(Unfortunately Sunrise can't read outputs from ART, ENZO or RAMSES at this point.)

Setup and background

Sunrise depends on several external libraries and is kind of a chore to build from scratch. Building it yourself is a "useful exercise", since you'll have to do so after the school, but our time here is limited enough that you'll use precompiled executables on triton. To use the Sunrise executables on triton, you need to:

  • add /home/hipacc-2/bin to your PATH
  • add /home/hipacc-2/lib:/home/hipacc-2/lib/boost to your LD_LIBRARY_PATH.
  • set the environment variable UNITSFILE to /home/hipacc-2/share/units.dat
  • load the modules intel/11.1 and /home/sfdong-ucsc/privatemodules/python
  • to use Patrik's python code, you also need to add /home/hipacc-2/patrik/python:/home/hipacc-2/lib64/python2.4/site-packages to your PYTHONPATH

Add all this to your .bash_profile.
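A sketch of the corresponding .bash_profile additions (the module load line is an assumption about how the modules system is set up on triton; check with module avail if it fails):

# sketch of .bash_profile additions; module syntax is an assumption
export PATH=/home/hipacc-2/bin:$PATH
export LD_LIBRARY_PATH=/home/hipacc-2/lib:/home/hipacc-2/lib/boost:$LD_LIBRARY_PATH
export UNITSFILE=/home/hipacc-2/share/units.dat
export PYTHONPATH=/home/hipacc-2/patrik/python:/home/hipacc-2/lib64/python2.4/site-packages:$PYTHONPATH
module load intel/11.1 /home/sfdong-ucsc/privatemodules/python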

The Sunrise output files are FITS files (see McrxConfigAndOutputFileFormat and BroadbandConfigAndOutputFileFormat for detailed specifications). These can be analyzed with whatever FITS file reader you want. This guide uses python to read the files. Start the interactive python shell, which also provides plotting capabilities, with ipython --pylab. If you are new to python, the official python documentation at http://www.python.org/doc is a good resource for general python questions. The most immediate thing to know about ipython is its excellent history feature: typing something and then hitting the up/down arrows will cycle back and forth through every command you've given starting with that. Very useful!

There are generally two common things that you'll look for in the outputs: images and spectra. The images are in FITS extensions called CAMERA<i>-type. The number <i> specifies which camera the output is for (as a run can contain any number of viewpoints, or cameras, from which the simulated object is observed), and type indicates what type of output the camera contains. For example, CAMERA0-IR is the dust emission reaching the first camera. The images are all data cubes, ie 3d arrays where each slice is a different wavelength (or filter band). They are in SI surface brightness units, W/(m*m^2*sr), where the first m denotes that the quantities are in lambda units, ie per wavelength. (In general, all Sunrise units are SI except for astro units like kpc and yr. Deal with it! :-)

The spectra can be obtained by summing the images over the spatial dimensions, but for convenience the main Sunrise executable does this and saves the result to a FITS table called INTEGRATED_QUANTITIES. It contains a field "lambda" with the wavelength array, and fields called "L_lambda_xxx0" etc. with the output spectra.

Plotting outputs

Before jumping into the actual projects, here's a short example of how to load and plot the outputs with ipython.

[hipacc-2@tcc-2-18 test]$ ipython --pylab
Python 2.4.3 (#1, Sep  3 2009, 15:37:37) 
Type "copyright", "credits" or "license" for more information.

IPython 0.10 -- An enhanced Interactive Python.
?         -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help      -> Python's own help system.
object?   -> Details about 'object'. ?object also works, ?? prints more.

  Welcome to pylab, a matplotlib-based Python environment.
  For more information, type 'help(pylab)'.
First, we import the pyfits module and open the file.
In [1]: import pyfits

In [2]: f=pyfits.open('crap.fits')
The file object has a method info() which shows the extensions in the file and what they contain.
In [3]: f.info()
Filename: crap.fits
No.    Name         Type      Cards   Dimensions   Format
0    PRIMARY     PrimaryHDU       6  ()            uint8
1    CAMERA0-STAR  ImageHDU        27  (300, 300, 61)  float32
2    CAMERA1-STAR  ImageHDU        27  (300, 300, 61)  float32
3    CAMERA2-STAR  ImageHDU        27  (300, 300, 61)  float32
4    INTEGRATED_QUANTITIES  BinTableHDU     24  61R x 4C      [D, D, D, D]
We want to load the contents of camera 0. The extensions are accessed by indexing the file object by the extension name (or number):
In [4]: im=f['CAMERA0-STAR'].data
The array im now contains the image data. We can look at the shape of this array.
In [5]: im.shape
Out[5]: (61, 300, 300)
The first axis is the wavelength/filter axis. Now we want to show an image of wavelength slice 10, with log scaling:
In [6]: imshow(log10(im[10,:,:]))
After a short delay you should see the image window pop up. The window supports some interactive features; most notably, clicking the disk icon saves the current contents as an image file. The plotting capabilities are supplied by the matplotlib module; for more info on usage, see the matplotlib home page.

Plotting a spectrum is equally easy. We load the wavelength vector into l and the spectrum into s and plot lambda*L_lambda in a log-log plot. clf() clears the current figure.

In [19]: l=f['INTEGRATED_QUANTITIES'].data.field('lambda')

In [20]: s=f['INTEGRATED_QUANTITIES'].data.field('L_lambda')

In [21]: clf()

In [22]: loglog(l,l*s)
Out[22]: [<matplotlib.lines.Line2D object at 0x15f89e50>]

That's the basics of loading the outputs. The numpy array package supports the usual element-wise operations plus much more advanced stuff. See the numpy homepage for details.
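As an example of such operations, here's how you could construct the spectrum yourself by summing a camera cube over its spatial axes, as described earlier. (A sketch: the overall normalization differs from the L_lambda field by pixel solid angle and area factors, so only the shape of the curve is directly comparable.)

In [23]: im=f['CAMERA0-STAR'].data

In [24]: sed=im.sum(axis=2).sum(axis=1)   # collapse (n_lambda, ny, nx) to (n_lambda,)

In [25]: loglog(l,l*sed/sed.max())        # arbitrary normalization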

Color images

I have a package make_color that creates RGB color composite images using the algorithm of Lupton et al. Here's an example of how to use it:

In [1]: import pyfits, make_color

In [2]: f=pyfits.open('broadband_034.fits')

In [3]: imshow(make_color.make_image(transpose(f['CAMERA3-BROADBAND'].data, axes=(1,2,0)),return_jpeg=False, band='28,26,24',scale=(1.5,1,1.5))/255)

Out[3]: <matplotlib.image.AxesImage object at 0x19c82910>

You should now see a nice image. Sorry for the convoluted syntax of make_image(). The band parameter specifies which slices of the image should be used for the R, G and B components; in this example, they are the z, r and u SDSS filters. (Of course, the numbers have to be within the number of slices of the image, which you get from the .shape attribute; otherwise it'll say "Invalid index".) The scale parameter sets the prescaling of the image components, which depends on the wavelength and the object. If you set it to "auto", it will autoscale all 3 components; if you set it to "autolum", it will autoscale the 3 components by one common factor (thus preserving color balance). For different wavelengths, you have to play with it to get something that looks good. The 3 numbers printed when you use autoscaling are the scale values used, so that gives you a starting point. You can also use the keyword autopercentile, which determines which percentile of the image should be at the saturation point. The default value is 5e-2, which should be reasonable; increasing it will brighten the image. (There are more parameters; look at the source in /home/hipacc-2/patrik/python/make_color.py if you are interested.)
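For example, to let make_image() choose one common scaling for the three components (the autolum option described above; a sketch, so expect to tweak the result):

In [4]: im3=transpose(f['CAMERA3-BROADBAND'].data, axes=(1,2,0))

In [5]: imshow(make_color.make_image(im3, return_jpeg=False, band='28,26,24', scale='autolum')/255)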

Notes on running the calculations

All of the calculations you will run here use a fair amount of memory and take a substantial amount of time (tens of minutes to hours). You should not run them on the login nodes on Triton: even if it works, it will slow the node down for everyone else, and more likely it will crash. (And if all 50 of you start a run, I'm sure Joel and Anatoly will be getting angry calls from SDSC...) Request an interactive session through the job queue, or submit the runs with a job file.
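On a TORQUE system like triton, an interactive session can be requested with something like the following (the resource request is an assumption; adjust it to your allocation):

qsub -I -l nodes=1:ppn=4,walltime=02:00:00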

Project 1: A spherical cloud with a central point source

This setup is one of the standard test cases included with the Sunrise code. It consists of a central blackbody point source surrounded by a spherical distribution of dust of uniform density. In the limit that the dust is very optically thick and has blackbody characteristics (ie, no wavelength dependence of absorption), it's simple to calculate the emerging spectrum using the Stefan-Boltzmann law and conservation of luminosity: the cloud surface must radiate the full source luminosity as a blackbody, so L = 4*pi*R^2*sigma*T_eff^4 fixes its effective temperature. With realistic dust grains this is not the case, and the goal of this project is to investigate how the emerging spectrum depends on parameters like the optical depth through the cloud.

To run this case, you use the executable ir_thick_test. (If you are interested, the source for this example is in sunrise/test/ir_thick_test.cc.) Using --help shows the parameters that you can supply on the command line. Most can probably be left at their defaults, at least initially. (The exception is --grain_data_dir, which should be set to /home/hipacc-2/dust_data/crosssections so the code can find the opacity files on triton.) The most relevant ones are listed below, followed by a sketch of an invocation:

  • source_temp is the temperature of the central source (in K).
  • source_lum is the luminosity of the central source (in L_sun).
  • radius is the size of the cloud (in AU).
  • tau is the V-band optical depth to the central source.
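Putting it together, a hypothetical invocation might look like this (the flags come from the list above, but check --help for the exact syntax, including how the output file name is passed):

ir_thick_test --grain_data_dir /home/hipacc-2/dust_data/crosssections --tau 10 ir_thick.fits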

The name of the output file is specified on the command line. The file contains data cubes called CAMERAi-STAR (the source radiation) and CAMERAi-IR (the dust emission), as well as an INTEGRATED_QUANTITIES extension which contains the integrated SEDs of the stellar and dust emission.

Assignment: Run the default case. Study how the appearance of the cloud changes with wavelength. Can you understand this variation based on what you know about the equation of radiative transfer? Now vary some parameter (like the source parameters or the optical depth of the cloud). Before looking at the outputs, predict how this should affect the SED and appearance of the cloud. Does it behave as you expected it to?

Project 2: An externally illuminated cloud

This is similar to the previous project but deals with a dark cloud illuminated by an external UV/visual radiation field, roughly modeling the "infrared dark clouds" seen by Spitzer and Herschel. The radiation transfer characteristics here are very different from those of a central source, and the goal is to study how the SED and appearance of these clouds depend on the structure of the cloud.

This case is run with the executable cloud. (If you are interested, the source for this example is in sunrise/test/cloud.cc.) Like above, --help shows the parameters available, and most can be left at their defaults initially. (The exception is again --grain_data_dir, which should be set to /home/hipacc-2/dust_data/crosssections so the code can find the opacity files on triton.) The most relevant parameters are:

  • intensity is the intensity of the external radiation field, in units of U, the local interstellar radiation field.
  • gridsize is the extent of the grid (in pc), so determines how much low-density material is around the cloud.
  • rsz is the cloud scale radius in the z-direction (in pc).
  • xzr is the ratio of the x and z cloud scale radii.
  • yzr is the ratio of the y and z cloud scale radii.
  • rpow is the exponent of the cloud density radial dependence.
  • rho0 is the central cloud dust density (in Msun/pc^3).
  • rho_bg is the background dust density filling the entire grid (in Msun/pc^3).

The name of the output file is specified on the command line. As in project 1, the file contains data cubes called CAMERAi-STAR (the source radiation) and CAMERAi-IR (the dust emission), as well as an INTEGRATED_QUANTITIES extension which contains the integrated SEDs of the stellar and dust emission.

Assignment: Run the default case. Study how the appearance of the cloud changes with wavelength and how the SED changes with position, from the center to the edge of the cloud. Can you understand this variation based on what you know about the equation of radiative transfer? Now vary some parameter (like the central density, size or density profile of the cloud). Before looking at the outputs, predict how this should affect the SED and appearance of the cloud. Does it behave as you expected it to?
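One way to study the SED as a function of position is to slice the IR cube at individual pixels. Here's a sketch (cloud.fits is a hypothetical output name, and the two pixels just pick the image center and a point near the edge):

In [1]: import pyfits

In [2]: f=pyfits.open('cloud.fits')   # hypothetical output file name

In [3]: ir=f['CAMERA0-IR'].data

In [4]: l=f['INTEGRATED_QUANTITIES'].data.field('lambda')

In [5]: ny,nx=ir.shape[1],ir.shape[2]

In [6]: loglog(l,l*ir[:,ny//2,nx//2],l,l*ir[:,ny//2,nx//10])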

Project 3: Images and spectra of a galaxy simulated with GADGET

In this project you will take the output from a GADGET simulation and process it with Sunrise. There is an example representing the Sbc galaxy from Jonsson, Groves & Cox (2010) that you can download in the "Downloads" section of this Wiki. If you have run your own Gadget simulation, you can use a snapshot from your own run, but this makes things more complicated. (If you do, you need to change the simpar file to point to the parameter file from your Gadget run. If your simulation was started with existing galaxies, you need to change the ic_xxx keywords in sfrhist.stub to be appropriate for that galaxy and make the ic_file<i> keywords point to snapshots of the isolated galaxies. See SfrhistConfigAndOutputFileFormat for detailed descriptions of these keywords. If your galaxy is very different from the Sbc example, you will also need to convince yourself that the results are converged; see ConvergenceTests.)

Now you are ready to run. At this point you should read SunriseOverview to get an idea of how running Sunrise works. The example contains the .config files that should be fed to the executables. (The one for broadband is called postprocess-xxx.config for legacy reasons.) It's recommended that you cut the number of rays down to 10k or so as a start, just to make sure the code runs before starting a full run. (You'll have to create job files to submit the jobs to TORQUE on Triton yourself; a sketch follows.)
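If you haven't written a TORQUE job file before, a minimal sketch might look like the following (the resource limits are assumptions, and whether the executables take the config file as an argument like this should be checked against SunriseOverview):

# hypothetical job file; adjust resources for your allocation
#PBS -l nodes=1:ppn=8,walltime=04:00:00
#PBS -N sfrhist
cd $PBS_O_WORKDIR
sfrhist sfrhist.config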

When you have completed the broadband run, you will have two output files, as described above. You can now start analyzing these outputs.

Assignment:

How do you think the spectra (the ones including dust) and optical images of the galaxy should change as a function of inclination? After making the prediction, take a look. Does it behave as you expected? Does the dust always make a galaxy fainter?

Now look at the appearance of the galaxy as a function of wavelength. How does it change?

Identify the wavelength slice containing the H-alpha line. Extract a spatially resolved image of the H-alpha luminosity and correlate it against the image of the star-formation rate (one of the slices of the AUX images). Is this what you'd expect from Kennicutt's 1998 paper?
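Here's a sketch of how you could find and extract the H-alpha slice (the file name and camera extension are hypothetical; use f.info() to see what your output actually contains, and recall that the lambda field is in SI units, ie meters):

In [1]: import pyfits

In [2]: f=pyfits.open('mcrx_034.fits')   # hypothetical file name

In [3]: l=f['INTEGRATED_QUANTITIES'].data.field('lambda')

In [4]: i=argmin(abs(l-656.3e-9))        # slice nearest H-alpha at 656.3 nm

In [5]: imshow(log10(f['CAMERA0'].data[i,:,:]))   # extension name is a guess; check f.info()

The star-formation-rate slice can be pulled out of the corresponding AUX extension in the same way.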

Project 4: Images and spectra of a galaxy simulated with GASOLINE

This is the same as the previous project, except that it uses GASOLINE. If you don't have a snapshot, talk to Fabio or Tom to get one. There are small differences in the sfrhist configuration when running Gasoline snapshots; Fabio's SunriseBootCamp should have enough info to get you started. Otherwise, things work the same as for the Gadget example, and the assignment is the same, too.
