
Example B: Multinest analysis of CFHTLenS


In this example we will run a nested sampling analysis of the CFHTLenS cosmic shear tomographic data from Heymans et al (2013). The CFHTLenS survey measured the weak lensing of galaxies by intervening matter in six redshift bins, and can give us constraints on omega_m and sigma_8, and weaker constraints on other parameters.

Because this is a pretty slow likelihood (code contributions welcome!) we will run it under MPI on multiple processors. The multinest algorithm scales happily to a large number of MPI processes, so although we'll use the example of four here you can use more if you're on a cluster. [1]

If you compiled with OpenMP for demo 1, then first clean and rebuild:

make clean
make

Then run the example like this:

mpirun -n 4 cosmosis --mpi examples/example_b.ini

If you're on a cluster or supercomputer then you may need to write a submission script to launch this as a job. The details will depend on the specific machine - see your machine's documentation.
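As a rough illustration, a SLURM submission script might look like the sketch below. The partition name, time limit, and environment setup lines are placeholders - every cluster is different, so adapt them to your own system:

```shell
#!/bin/bash
#SBATCH --job-name=example_b
#SBATCH --ntasks=4            # one MPI process per task; increase on a cluster
#SBATCH --time=12:00:00       # placeholder - this job takes a long time
#SBATCH --partition=standard  # placeholder - use your cluster's partition name

# Load whatever environment provides cosmosis and MPI on your system.
# These module/setup lines are hypothetical examples:
# module load openmpi
# source cosmosis-configure

mpirun -n 4 cosmosis --mpi examples/example_b.ini
```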

This job will take a long time to run.

Once it completes you can generate constraint plots:

postprocess examples/example_b.ini -o plots -p example_b

This will generate plots like the one below, which replicates a figure from Heymans et al.


The ini file examples/example_b.ini describes the structure of the pipeline that calculates the CFHTLenS likelihood.

It selects the sampler, multinest, and the options which apply to that sampler:

sampler = multinest
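In outline, the relevant parts of the ini file look something like the fragment below. The option values shown here are illustrative, not necessarily the ones used in examples/example_b.ini:

```ini
[runtime]
sampler = multinest

[multinest]
; The size of the live-point ensemble is the main knob you might tune.
; The value here is illustrative.
live_points = 250
```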


Unused sections in the ini file are ignored. It then describes where to save the output, before moving on to the pipeline itself:

modules = consistency camb sigma8_rescale halofit load_nz linear_alignment shear_shear add_intrinsic 2pt cfhtlens
values = examples/values_b.ini
likelihoods = cfhtlens
extra_output = 

The long list of modules shows all the different modules that go into this calculation - they are:

  • consistency, to generate a consistent set of parameters
  • camb, to calculate distances and the linear matter power spectrum
  • sigma8_rescale, to re-scale P(k) for the specified sigma_8
  • halofit, to compute non-linear power
  • load_nz to load the CFHTLenS galaxy number density for each z bin
  • linear_alignment, to generate the intrinsic alignment power model
  • shear_shear, to calculate the 2D shear and IA power spectra
  • add_intrinsic, to sum together the II, GI, and GG spectra
  • 2pt, to Bessel-integrate the 2D spectra into 2D correlation functions
  • cfhtlens, to calculate the log-likelihood of the CFHTLenS data given this cosmology

The likelihood we want to extract from the end of all this is listed as cfhtlens, which means cosmosis will look for a parameter called cfhtlens_like in the likelihoods section at the end of the run.

You can find out more about all of the inputs and outputs these modules have in our listing of standard library modules.

The inputs to the beginning of the pipeline are described in the examples/values_b.ini file. This has various parameters divided into two sections, one for cosmological parameters and one for intrinsic alignments. Some of the parameters are kept fixed and others are allowed to vary within a range.
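The structure of a values file can be sketched as below. The parameter names and numbers here are illustrative rather than copied from examples/values_b.ini; the convention in cosmosis values files is that a single number fixes a parameter, while a `min start max` triple lets it vary over that range:

```ini
[cosmological_parameters]
omega_m = 0.1  0.3  0.5   ; varied: lower limit, starting value, upper limit
h0 = 0.72                 ; fixed

[intrinsic_alignment_parameters]
A = -3.0  1.0  3.0        ; varied alignment amplitude (illustrative)
```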

The sampling

The multinest sampler is a nested sampler, in which an ensemble of live points scattered throughout the prior space gradually shrinks towards the peak of the posterior. Multinest has a large number of tunable parameters, but the defaults defined in cosmosis are usually reasonable, and the only one you would generally tune is the size of the live-point ensemble.

The outputs of multinest are not raw samples from the posterior: each point carries a weight, and you must resample according to these weights to obtain an equally weighted sample.
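The reweighting step can be sketched in a few lines of Python. This is a generic illustration of weighted resampling, not the actual postprocess code:

```python
import random

def resample_equal(samples, weights, n, seed=0):
    """Draw n equally weighted samples from a weighted chain.

    Points are drawn with replacement, with probability proportional
    to their weight, so the result can be treated as an ordinary
    equally weighted sample.
    """
    rng = random.Random(seed)
    return rng.choices(samples, weights=weights, k=n)

# Toy chain: three points, the second carrying most of the weight
chain = [0.28, 0.30, 0.32]
weights = [0.1, 0.8, 0.1]
equal = resample_equal(chain, weights, n=1000)
# 'equal' can now be histogrammed directly, with every point counting once
```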

The plotting

The cosmosis postprocess program understands how to get a posterior sample from the multinest output, and does so before generating the 1D and 2D constraint plots that are produced by the code.

Each different sampler has a different set of post-processing commands defined for it.

[1] We haven't been able to find out the maximum useful number of MPI processes for multinest. If you know, let us know.