imp 2: The impossible project for SAS data reduction procedures, second version

This code is the fourth implementation of these data correction and 
reduction procedures: the first was based on Matlab, followed by "imp", 
"sasHDF5" and now "imp2". 

As the name suggests, this variant takes an approach more similar to the 
"imp" system: every datafile has a configuration file containing all the 
information required for data reduction.

The sasHDF5 code used a big list to build an even bigger HDF5 database 
containing every sample measurement. While entertaining, and quite usable for 
larger datasets, it introduced a single point of failure into the entire 
procedure: the database itself. 

Changes and improvements over the previous imp will be:
  - completely rewritten codebase (and hosted on a git server)
  - offloading the data read-in procedure to "fabio" from Jerome Kieffer
  - offloading the configuration file read-in procedure to ConfigParser
    (which reads text-based configuration files like the Windows .ini files)
  - offloading the integration procedure to the PyFAI project's procedure
  - adding an image distortion correction procedure
  - data storage in SAS2012-compatible HDF5 files (2D) and ascii (1D) files
  - correction steps separated and defined as in the "Everything SAXS" review
    paper (Pauw, B. R., J. Phys.: Condens. Matter, 2013)
  - using Python's "logging" module for the logging procedures
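
As an illustration of the configuration approach (section and key names here 
are hypothetical, not the final file layout), ConfigParser reads .ini-style 
per-datafile configuration files like this:

```python
import configparser

# a hypothetical per-datafile configuration; the real keys are to be decided
ini_text = """
[correction]
tfact = 0.982
time = 1200
"""

cfg = configparser.ConfigParser()
cfg.read_string(ini_text)
print(cfg.getfloat("correction", "tfact"))
print(cfg.getint("correction", "time"))
```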

The PyFAI code by Jerome Kieffer is much, much quicker for larger datasets, 
but its focus on *very fast* correction and integration procedures makes it 
harder to understand and adapt. 
imp2 will be much slower, but it will be much more adaptable and clearer in 
its construction. 

--Requirements--
An installed version of the fabio package, and perhaps PyFAI (not yet decided).

--Application proposal--
The idea is to make the corrections work like a small python script, making 
it easy to see the structure of the corrections while keeping them simple 
enough for beginning users to apply. The details of each step, and the 
associated parameters, are stored internally. 
Thus, the user keeps an overview of the correction procedure but should have
no problem filling in the correction details where necessary.

Parameters can be defined beforehand, or at the correction steps themselves.
Extended definitions (e.g. values with uncertainties) should be placed 
beforehand.

For example:

#we define some values, for example a transmission factor with 10% relative 
#   uncertainty:
pre = {
    "tfact" : {
        "value" : 0.982,
        "relativeSTD" : 0.1
    },
    "cfact" : {
        "value" : 13.27,
        "relativeSTD" : 0.1,
        "absoluteSTD" : 1.327
    }
}
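
In this scheme, an "absoluteSTD" is simply value × relativeSTD; a quick 
consistency check for the "cfact" entry above:

```python
# "cfact" above: absoluteSTD should equal value * relativeSTD
value, relativeSTD = 13.27, 0.1
absoluteSTD = value * relativeSTD
print(round(absoluteSTD, 3))  # 1.327, matching the dictionary entry
```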

#we start by loading the corrector (the exact call is still to be decided;
#   for the sake of this example, assume):
impy = imp2.corrector()

#we can load the settings ("pre" defined above) through:
impy.settings(pre)

#at any point in the correction procedure, the settings can be overridden
#with fresh arguments to the correctors, as is done below.

#We can load the datafile (through fabIO, which handles the read-in for DS):
impy.DS(fname = 'path/to/dataFile')
#or, if your datafile filename has been correctly specified in the settings:
impy.DS()

#we can do the dezingering: DZ, requiring no extra parameters
impy.DZ()

#we can apply the mask image, define the mask valid pixel window,
#   and the valid detector pixel values (removing pinned pixels):
impy.MK(mfile = 'path/to/maskFile', mwindow = [0.5, 1.5], dwindow = [0, 2**30])
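
A minimal pure-Python sketch (function name hypothetical) of how the MK step 
could combine the mask window and the detector-value window into one validity 
map, keeping only pixels that pass both tests:

```python
def valid_map(data, mask, mwindow=(0.5, 1.5), dwindow=(0, 2**30)):
    """True where the mask value lies in mwindow AND the detector
    value lies in dwindow (i.e. the pixel is valid)."""
    return [
        [(mwindow[0] <= m <= mwindow[1]) and (dwindow[0] <= d <= dwindow[1])
         for m, d in zip(mrow, drow)]
        for mrow, drow in zip(mask, data)
    ]

# a 2x2 toy image: one pinned pixel (2**31) and one masked-out pixel (mask 0)
data = [[10, 2**31], [5, 7]]
mask = [[1, 1], [0, 1]]
print(valid_map(data, mask))  # [[True, False], [False, True]]
```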

#do the deadtime corrections with 1 and 2 ms pulse deadtime constants
#   this also calculates uncertainties
impy.DT(tau1 = 0.001, tau2 = 0.002)
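
For reference, a simplified single-constant sketch of a non-paralysable 
dead-time correction (the actual two-constant model used by DT is the one 
defined in the review paper, and differs from this):

```python
def deadtime_correct(observed_rate, tau=0.001):
    """Non-paralysable model: true rate = observed / (1 - observed * tau)."""
    return observed_rate / (1.0 - observed_rate * tau)

# at 100 counts/s with tau = 1 ms, the detector is dead 10% of the time:
print(deadtime_correct(100.0, tau=0.001))  # ~111.1 counts/s
```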

#correct for flatfield
impy.FF(fffile = 'path/to/flatFile')

#correct for remaining nonlinear response using interpolated values between
#   detected-value : real-value combinations specified in a file:
impy.GA(gfile = 'path/to/gamma/data' )
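
The GA step boils down to linear interpolation between tabulated 
detected-value : real-value pairs; a sketch (function name and table values 
are illustrative only):

```python
# hypothetical detected->real lookup table, as would be read from gfile
pairs = [(0, 0), (100, 105), (200, 220)]

def linearise(detected, pairs):
    """Linearly interpolate the real value for a detected value."""
    xs = [p[0] for p in pairs]
    ys = [p[1] for p in pairs]
    for i in range(len(xs) - 1):
        if xs[i] <= detected <= xs[i + 1]:
            f = (detected - xs[i]) / (xs[i + 1] - xs[i])
            return ys[i] + f * (ys[i + 1] - ys[i])
    return detected  # outside the table: leave unchanged

print(linearise(150, pairs))  # 162.5, halfway between 105 and 220
```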

#correct for darkcurrent
impy.DC(dcparam = [0.001, 1.2e-5, 0, 1200] )

#correct for measurement time
impy.TI(time = 1200)

#correct for flux
impy.FL(flux = 1.38)

#correct for transmission
impy.TM(tfact = 0.89)
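
The TI, FL and TM steps each divide out a scalar; a sketch (function name 
hypothetical) of the combined normalisation, with relative uncertainties 
added in quadrature as a common first-order approximation:

```python
import math

def normalise(counts, counts_std, time, flux, tfact, tfact_std):
    """Divide out measurement time, flux and transmission factor;
    propagate the relative uncertainties in quadrature."""
    value = counts / (time * flux * tfact)
    rel = math.sqrt((counts_std / counts) ** 2 + (tfact_std / tfact) ** 2)
    return value, value * rel

val, std = normalise(1.0e6, 1.0e3, time=1200, flux=1.38,
                     tfact=0.89, tfact_std=0.0089)
print(val, std)
```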

#correct for geometric distortion