
MicroLea - Probabilistic inference in Python

MicroLea (or µLea) is a Python module for modeling discrete probability distributions and performing probabilistic inference on them.

MicroLea is a lightweight re-implementation of the Lea module, with the objective of showing as clearly as possible the implementation of the Statues algorithm, as described in the paper "Probabilistic inference using generators - the Statues algorithm".

IMPORTANT NOTE: Unlike usual open-source projects, µLea is NOT meant to evolve much. Its goal is purely educational, namely to explain how the Statues algorithm can be implemented and how it works. There are certainly many features you might like to add: calculation of indicators, random samples, configurable display of probability distributions, etc. However, these will not be added to µLea, whose essence is to remain simple and short. If this is a blocker for you, you can either fork the µLea project or have a look at Lea, which has far richer functionality and which is meant to evolve.

Language: Python 3.x

License: MIT

Author: Pierre Denis (pie.denis@skynet.be)

Installation

  • download the microlea.py file
  • start a Python 3.x session in the same directory as microlea.py
  • type in the statements given in the examples below

You can also find all the examples in the microlea_examples.py file.
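
As a quick check of the installation (a minimal sketch, assuming microlea.py is in the current directory), the following session models a fair coin and queries it:

from microlea import *
coin = ElemPex.bool(0.5)   # fair coin: probability 0.5 of showing True
print (P(coin))
# -> 0.5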

Examples

The examples below use the following aliases, which are simple Python assignments.

from microlea import *

# aliases defined for convenience
E = ElemPex
EB = ElemPex.bool
F = FuncPex
T = TuplePex
C = CondPex
B = TablePex
M = MixtPex
(t,f) = (True,False)

In the Python snippets below, lines beginning with # are comments and those beginning with # -> show the result of executing the line just above.

Elementary pex and functional pex (operators)

# fair die
da = E({1: 1/6, 2: 1/6, 3: 1/6, 4: 1/6, 5: 1/6, 6: 1/6})
print (da)
# -> {1: 0.16666666666666669, 2: 0.16666666666666669, 3: 0.16666666666666669, 4: 0.16666666666666669, 5: 0.16666666666666669, 6: 0.16666666666666669}
# note: thanks to condensation, this is equivalent to
# da = E({1: 1, 2: 1, 3: 1, 4: 1, 5: 1, 6: 1})
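# quick sketch to verify the condensation claim: integer weights are
# normalized into the same fair-die distribution
da_w = E({1: 1, 2: 1, 3: 1, 4: 1, 5: 1, 6: 1})
print (da_w)
# -> same distribution as da above (the last float digits may differ slightly)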
# comparison operators create a boolean distribution...
print (da == 1)
# -> {False: 0.8333333333333334, True: 0.16666666666666669}
# ...from which the P function extracts the probability of True
print (P(da == 1))
# -> 0.16666666666666669
# inequality operators are supported
print (P(da >= 4))
# -> 0.5000000000000001
print (P(da >= 7))
# -> 0.0
# binding in action: da occurrences in the same pex are consistent
print(da+da)
# -> {2: 0.16666666666666669, 4: 0.16666666666666669, 6: 0.16666666666666669, 8: 0.16666666666666669, 10: 0.16666666666666669, 12: 0.16666666666666669}
print(da-da)
# -> {0: 1.0}
# boolean operator AND is supported, with binding
print (P((2 <= da) & (da < 6)))
# -> 0.6666666666666667
# as well as OR
print (P((2 > da) | (da >= 6)))
# -> 0.33333333333333337
# and NOT (here we can check de Morgan's law)
print (P(~((2 > da) | (da >= 6))))
# -> 0.6666666666666667
# create a new die, by cloning
db = da.new()
# add the results of the two dice
dab = da + db
print (dab)
# modulo operator to get the parity bit of each die and of the two dice together
parity_da = da % 2
parity_db = db % 2
parity_dab = dab % 2
print (parity_dab)
# -> {0: 0.5, 1: 0.5}
# applying a user-defined function
def parityNameFunc(b):
    if b == 0:
        return 'even'
    return 'odd'
parityName_dab = F(parityNameFunc,parity_dab)
print (parityName_dab)
# -> {'odd': 0.5, 'even': 0.5}
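
Since FuncPex applies a plain Python callable to the values of its argument, built-in functions work as well. The sketch below, which only reuses the aliases and dice defined above, builds the distribution of the absolute difference of the two dice; a zero difference has probability 6/36.

# absolute difference of the two dice, using the built-in abs function
diff_dab = F(abs, da - db)
print (P(diff_dab == 0))
# -> about 0.1667 (i.e. 1/6, rendered as a float like the outputs above)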

Conditional pex

# conditional pex: die da given that we know that da < 3
print (C(da,da < 3))
# -> {1: 0.5, 2: 0.5}
# equivalent to this
print (da.given(da < 3))
# -> {1: 0.5, 2: 0.5}
# sum of dice given that we know that da < 3
print (dab.given(da < 3))
# -> {2: 0.08333333333333331, 3: 0.16666666666666663, 4: 0.16666666666666663, 5: 0.16666666666666663, 6: 0.16666666666666663, 7: 0.16666666666666663, 8: 0.08333333333333331}
# boolean AND: sum of dice given that we know that da < 3 AND db < 3
print (dab.given((da < 3) & (db < 3)))
# -> {2: 0.25, 3: 0.5, 4: 0.25}
# boolean OR: sum of dice given that we know that da < 3 OR db < 3
print (dab.given((da < 3) | (db < 3)))
# -> {2: 0.04999999999999999, 3: 0.09999999999999998, 4: 0.14999999999999997, 5: 0.19999999999999996, 6: 0.19999999999999996, 7: 0.19999999999999996, 8: 0.09999999999999998}
# conditional pex can also infer inputs from evidence on the output
print (da.given(dab < 4))
# -> {1: 0.6666666666666666, 2: 0.3333333333333333}
# a tuple pex is useful for explanation, by exposing the elementary events
print (T(da,db,dab).given(dab < 4))
# -> {(1, 1, 2): 0.3333333333333333, (2, 1, 3): 0.3333333333333333, (1, 2, 3): 0.3333333333333333}
print (T(da,db,dab).given((da < 3) & (db < 3)))
# -> {(1, 1, 2): 0.25, (2, 2, 4): 0.25, (2, 1, 3): 0.25, (1, 2, 3): 0.25}
print (parityName_dab.given(parity_da == parity_db))
# -> {'even': 1.0}
print (parityName_dab.given(parity_da != parity_db))
# -> {'odd': 1.0}
# conditional probability
print (P((dab > 4).given(da < 3)))
# -> 0.5833333333333333
# verification of conditional probability definition: P(A | B) = P(A & B) / P(B)
print (P((dab > 4) & (da < 3)) / P(da < 3))
# -> 0.5833333333333331
# verification of Bayes rule: P(A | B) = P(B | A) * P(A) / P(B)
print (P((da < 3).given(dab > 4)) * P(dab > 4) / P(da < 3))
# -> 0.5833333333333331
# same checks for the counterpart P(B | A)
print (P((da < 3).given(dab > 4)))
# -> 0.2333333333333333
print (P((da < 3) & (dab > 4)) / P(dab > 4))
# -> 0.23333333333333328
print (P((dab > 4).given(da < 3)) * P(da < 3) / P(dab > 4))
# -> 0.23333333333333334
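
The same constructs can be used to check the law of total probability, P(A) = P(A | B) * P(B) + P(A | ~B) * P(~B). The sketch below, using only pexes already defined, computes both sides for A = (dab > 4) and B = (da < 3); both should evaluate to 30/36, up to floating-point rounding.

# verification of the law of total probability
print (P(dab > 4))
print (P((dab > 4).given(da < 3)) * P(da < 3) + P((dab > 4).given(da >= 3)) * P(da >= 3))
# -> both about 0.8333, up to floating-point rounding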

Table pex and Bayesian networks

# prior probability
# same as rain = E({True:0.20,False:0.80})
rain = EB(0.20)
# sprinkler as a CPT depending on rain
sprinkler = B( rain, { True  : EB(0.01),
                       False : EB(0.40)} )
# grassWet as a CPT depending on sprinkler and rain
grassWet = B( T(sprinkler, rain ),
              { (False   , False) : False,
                (False   , True ) : EB(0.80),
                (True    , False) : EB(0.90),
                (True    , True ) : EB(0.99)} )
# check rain definition
print (P(rain))
# -> 0.2
# check sprinkler CPT definition
print (P(sprinkler.given(rain)))
# -> 0.01
print (P(sprinkler.given(~rain)))
# -> 0.4000000000000001
# check grassWet CPT definition
print (P(grassWet.given(~sprinkler & ~rain)))
# -> 0.0
print (P(grassWet.given(~sprinkler & rain)))
# -> 0.8
print (P(grassWet.given( sprinkler &~rain)))
# -> 0.9000000000000001
print (P(grassWet.given( sprinkler & rain)))
# -> 0.99
# unconditional probabilities
print (P(sprinkler))
# -> 0.32200000000000006
print (P(grassWet))
# -> 0.4483800000000001
print (P(grassWet.given(rain)))
# -> 0.8019000000000001
print (P(rain.given(grassWet)))
# -> 0.35768767563227616
print (P(rain.given(~grassWet)))
# -> 0.07182480693230847
# calculate full joint distribution using tuple pex (the combination with probability 0 does not appear)
print(T(rain,sprinkler,grassWet))
# -> {(False, True, True): 0.28800000000000003, (False, False, False): 0.48, (True, True, False): 2.000000000000002e-05, (True, False, False): 0.039599999999999996, (False, True, False): 0.031999999999999994, (True, True, True): 0.0019800000000000004, (True, False, True): 0.1584}
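
Any combination of evidence can be queried on the network. For instance, the sketch below, which only reuses the pexes defined above, asks for the probability that the sprinkler was on given that the grass is wet; the expected value is about 0.6467.

# probability that the sprinkler was on, given that the grass is wet
print (P(sprinkler.given(grassWet)))
# -> about 0.6467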

Mixture pex and Bayesian networks

sprinkler2 = M( C(EB(0.01),  rain),
                C(EB(0.40), ~rain))
grassWet2 = M( C(False   , ~sprinkler2 & ~rain ),
               C(EB(0.80), ~sprinkler2 &  rain ),
               C(EB(0.90),  sprinkler2 & ~rain ),
               C(EB(0.99),  sprinkler2 &  rain ))
print (P(grassWet2.given(rain)))
# -> 0.8019000000000001
print (P(rain.given(grassWet2)))
# -> 0.35768767563227616
print (P(rain.given(~grassWet2)))
# -> 0.07182480693230847
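
Since this mixture formulation encodes the same dependencies as the table pex of the previous section, the unconditional probabilities should match the ones computed there. A quick sanity check (sketch):

# sanity check: same unconditional probabilities as with the table pex
print (P(sprinkler2))
# -> about 0.322, same as P(sprinkler) above
print (P(grassWet2))
# -> about 0.44838, same as P(grassWet) above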

Probabilities as fractions

from fractions import Fraction
u = Fraction(1)
# fair die with exact probabilities of 1/6 each
df = E(dict((i,u) for i in range(1,7)))
print (df)
# -> {1: Fraction(1, 6), 2: Fraction(1, 6), 3: Fraction(1, 6), 4: Fraction(1, 6), 5: Fraction(1, 6), 6: Fraction(1, 6)}
dg = df.new()
dfg = df + dg
print (dfg)
# -> {2: Fraction(1, 36), 3: Fraction(1, 18), 4: Fraction(1, 12), 5: Fraction(1, 9), 6: Fraction(5, 36), 7: Fraction(1, 6), 8: Fraction(5, 36), 9: Fraction(1, 9), 10: Fraction(1, 12), 11: Fraction(1, 18), 12: Fraction(1, 36)}
# prior probability, as an exact fraction
rain3 = EB(u*20/100)
sprinkler3 = B( rain3, { True  : EB(u*1/100),
                         False : EB(u*40/100)} )
# grassWet as a CPT depending on sprinkler and rain
grassWet3 = B( T(sprinkler3, rain3 ),
              { (False     , False) : False,
                (False     , True ) : EB(u*80/100),
                (True      , False) : EB(u*90/100),
                (True      , True ) : EB(u*99/100)} )
print (P(grassWet3.given(rain3)))
# -> 8019/10000
print (float(P(grassWet3.given(rain3))))
# -> 0.8019
print (P(rain3.given(grassWet3)))
# -> 891/2491
print (float(P(rain3.given(grassWet3))))
# -> 0.3576876756322762
print (P(rain3.given(~grassWet3)))
# -> 1981/27581
print (float(P(rain3.given(~grassWet3))))
# -> 0.07182480693230847
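
With exact fractions, identities such as Bayes' rule hold without any rounding error. The sketch below recomputes P(rain3 | grassWet3) from the three other quantities; it should yield exactly the fraction obtained above.

# Bayes' rule with exact fractions: P(A | B) = P(B | A) * P(A) / P(B)
print (P(grassWet3.given(rain3)) * P(rain3) / P(grassWet3))
# -> 891/2491, exactly the value of P(rain3.given(grassWet3)) above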