PyMC version 2.3

We don't currently recommend using pymc; we suggest the emcee, metropolis, or multinest samplers instead. pymc may be removed from a future version of CosmoSIS.

Authors: Chris Fonnesbeck, Anand Patil, David Huard & John Salvatier

Website: https://github.com/pymc-devs/pymc/tree/2.3

Documentation: http://bit.ly/pymc_docs

PyMC is a Python module that implements Bayesian statistical models and fitting algorithms, including Markov chain Monte Carlo. PyMC has been integrated into CosmoSIS to run a single chain on one processor, or several chains in parallel on multiple processors using the MPI for Python package mpi4py.
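For orientation, the snippet below is a minimal stand-alone PyMC 2 example, independent of CosmoSIS, showing the kind of MCMC object the sampler wraps. The toy parameter, likelihood, and all numbers are illustrative only.

```python
import pymc

# A toy stochastic parameter with a flat prior (illustrative only).
x = pymc.Uniform('x', lower=-5.0, upper=5.0)

# A toy Gaussian likelihood centred on x, with one observed data point.
data = pymc.Normal('data', mu=x, tau=1.0, value=1.3, observed=True)

# Build the sampler and draw samples with the PyMC 2 API.
M = pymc.MCMC([x, data])
M.sample(iter=10000, burn=1000)

print(M.trace('x')[:].mean())
```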

# Usage

```
cosmosis [ini]
mpirun cosmosis --mpi [ini]
```
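For example, with a hypothetical parameter file params.ini (the file name and process count are illustrative), a parallel run could look like:

```
mpirun -n 4 cosmosis --mpi params.ini
```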
# ini file options

The CosmoSIS ini file should contain the [pipeline], [output], and module interface sections as usual, together with the following:

```
[runtime]
sampler = pymc

[pymc]
; Proposal covariance matrix (optional)
covmat = covmat.txt
; The number of samples to generate, default = 1000
samples = 100000
adaptive_mcmc = yes
; Gelman-Rubin convergence threshold; if Rconverge is not specified
; then no runtime convergence test is carried out
Rconverge = 1.2
; Fraction of samples to burn, default = 0.
; If fburn > 1 it is taken as the total number of steps to burn.
fburn = 0.2
; With adaptive MCMC the covariance matrix is tuned every nsteps after fburn, default = 100
nsteps = 100
```
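If you want to supply a starting guess for covmat.txt, one simple option is a diagonal matrix built from rough per-parameter step sizes. The sketch below does this with numpy; the parameter names and numbers are illustrative, the ordering must match the varied parameters in values.ini, and the exact file format (e.g. whether a header row of parameter names is expected) is not described on this page, so check it against the main CosmoSIS documentation.

```python
import numpy as np

# Rough guesses for the proposal standard deviation of each varied
# parameter, in the same order as they appear in values.ini
# (names and numbers are illustrative only).
sigmas = [0.05,   # omega_m
          0.005]  # omega_b

# Diagonal covariance matrix from the squared step sizes.
cov = np.diag(np.array(sigmas) ** 2)

# Write as a plain whitespace-separated square matrix.
np.savetxt("covmat.txt", cov)
```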

# Step methods

  • Adaptive MCMC

The AdaptiveMetropolis (AM) step method works like regular Metropolis, except that the stochastic parameters are block-updated using a multivariate jump distribution whose covariance is tuned during sampling. In CosmoSIS this tuning is done every nsteps samples. Although the chain is non-Markovian, i.e. the proposal distribution depends on the history of the chain, it has correct ergodic properties. See Haario et al. (2001) for details.

In the CosmoSIS ini file, if adaptive_mcmc = yes the default delay before tuning begins is 100 steps. If adaptive_mcmc = no and a covariance matrix is supplied, the delay is set to a number so large that adaptation never takes place, and the chain runs as normal Metropolis using the supplied covariance matrix. If no covariance matrix is provided, PyMC estimates one based on the type of each stochastic parameter; however, we suggest providing a sensible guess for the covariance rather than relying on the automatic assignment (see the sketch after this list for how these options map onto PyMC's step-method arguments).

  • Metropolis

This step method applies the one-at-a-time Metropolis-Hastings algorithm to each stochastic parameter. In CosmoSIS, PyMC is allowed to choose the proposal distribution based on the parameter type. (Should we impose a default Normal distribution here?) In the ini file this step method is selected when no covariance matrix is supplied and adaptive_mcmc = no, as shown in the sketch below.
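As a rough illustration of the two step methods described above, the sketch below shows how they would be selected through PyMC 2's own API. This is not the CosmoSIS wrapper code itself: the parameter names and ranges are toy examples, and cov_guess, delay and interval are placeholders standing in for the covmat, delay and nsteps options described above.

```python
import numpy as np
import pymc

# Toy parameters standing in for the cosmological parameters
# (names and ranges are illustrative only).
omega_m = pymc.Uniform('omega_m', lower=0.1, upper=0.5)
omega_b = pymc.Uniform('omega_b', lower=0.01, upper=0.1)

M = pymc.MCMC([omega_m, omega_b])

# Adaptive MCMC: block-update both parameters, starting from a guessed
# covariance and re-tuning it every `interval` steps after `delay` steps.
cov_guess = np.diag([0.05**2, 0.005**2])
M.use_step_method(pymc.AdaptiveMetropolis, [omega_m, omega_b],
                  cov=cov_guess, delay=100, interval=100)

# Plain one-at-a-time Metropolis-Hastings with a Normal proposal,
# used when no covariance matrix is supplied and adaptive_mcmc = no.
# M.use_step_method(pymc.Metropolis, omega_m,
#                   proposal_distribution='Normal', proposal_sd=0.05)

M.sample(iter=10000, burn=1000)
```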

# Initializing the chain

The starting values for each chain are drawn uniformly at random from the parameter ranges given in values.ini.
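In other words, the initialisation is equivalent to something like the following sketch (the parameter names and range values are illustrative only):

```python
import numpy as np

# (min, start, max) ranges as they would appear in values.ini
# (numbers are illustrative only).
ranges = {'omega_m': (0.1, 0.3, 0.5),
          'omega_b': (0.01, 0.04, 0.1)}

# Each chain starts from an independent uniform draw within each range.
start = {name: np.random.uniform(lo, hi)
         for name, (lo, _, hi) in ranges.items()}
print(start)
```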

# Priors

(Is this feature working yet?) There are three types of prior currently available: Gaussian (normal), uniform, and exponential. If no prior is given, a uniform prior is assumed for all parameters. To add priors, in the ini file set:

```
[runtime]
Prior = priors.ini
```

where the priors.ini file is, e.g.,

```
[cosmological_parameters]
omega_m = normal 0.25 0.05
omega_b = exponential 0.02
```
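For reference, priors like the ones above correspond roughly to the following PyMC 2 distributions. This is a sketch only: PyMC parameterises the normal by the precision tau rather than the standard deviation, and whether the exponential argument above is a rate or a mean is not stated on this page, so the Exponential line is an assumption.

```python
import pymc

# normal 0.25 0.05  ->  mean 0.25, standard deviation 0.05
# (PyMC 2 uses the precision tau = 1/sigma^2).
omega_m = pymc.Normal('omega_m', mu=0.25, tau=1.0 / 0.05**2)

# exponential 0.02: PyMC 2's Exponential takes the rate beta.
# If 0.02 is meant as the mean, the rate would instead be 1/0.02.
omega_b = pymc.Exponential('omega_b', beta=0.02)
```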

# Convergence statistics

CosmoSIS has built-in on-the-fly convergence diagnostics using the Gelman-Rubin test to establish whether or not the chains have converged when more than one chain is running in parallel. To enable this, set Rconverge in the [pymc] section of the CosmoSIS ini file; a commonly used value is Rconverge = 1.1. PyMC can also be used to post-process the chains output by CosmoSIS to obtain Geweke z-scores or the Raftery-Lewis criteria.
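A sketch of that post-processing step is given below, using PyMC 2's diagnostics functions. The chain file name and column index are illustrative, and the example assumes a plain-text CosmoSIS chain with '#'-prefixed header lines.

```python
import numpy as np
import pymc

# Load one parameter's trace from a CosmoSIS text chain
# (file name and column index are illustrative).
chain = np.loadtxt("chain.txt")
trace = chain[:, 0]

# Geweke z-scores comparing early and late segments of the chain.
scores = pymc.geweke(trace, first=0.1, last=0.5, intervals=20)

# Raftery-Lewis run-length diagnostics for the 2.5% quantile.
pymc.raftery_lewis(trace, q=0.025, r=0.01)
```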
