Mutual information of independent variables expected to be exactly 0.0

Issue #64 resolved
Pierre Denis repo owner created an issue
(die1,die2) = lea.interval(1,6).new(2)
lea.mutual_information(die1,die2)
# -> 2.6645352591003757e-15

The expected value is 0.0 because die1 and die2 are independent. However, due to the numerical representation with floats, the returned value is not 0.0, although very small.
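The residue comes from summing logarithm terms in binary floating point. A minimal sketch outside Lea (plain Python, assuming base-2 entropies, consistent with the values shown below) reproduces the effect:

```python
import math

# Two independent fair dice: p(i) = 1/6, joint p(i, j) = 1/36.
p = 1.0 / 6.0
h_die = -sum(p * math.log2(p) for _ in range(6))        # H(die1) = H(die2)
h_joint = -sum(p * p * math.log2(p * p)
               for _ in range(6) for _ in range(6))     # H(die1, die2)

# I(die1; die2) = H(die1) + H(die2) - H(die1, die2):
# exactly 0 for independent variables, but float rounding
# can leave a value near machine epsilon instead of 0.0.
mi = h_die + h_die - h_joint
print(mi)
```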

Similarly:

die1.cond_entropy(die2)
# -> 2.5849625007211534
die1.entropy
# -> 2.584962500721156

Since die1 and die2 are independent, the conditional entropy is expected to be exactly equal to the entropy of the first variable.
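The same rounding shows up when the conditional entropy is computed through the chain rule H(die1 | die2) = H(die1, die2) - H(die2). A sketch in plain Python (assuming base-2 logarithms, as in the values above):

```python
import math

# Independent fair dice: H(die1 | die2) should equal H(die1) = log2(6).
p = 1.0 / 6.0
h_die2 = -sum(p * math.log2(p) for _ in range(6))       # H(die2)
h_joint = -sum(p * p * math.log2(p * p)
               for _ in range(6) for _ in range(6))     # H(die1, die2)

# Chain rule: H(die1 | die2) = H(die1, die2) - H(die2).
h_cond = h_joint - h_die2
print(h_cond, math.log2(6))  # close, but not necessarily bit-for-bit equal
```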

The issue also occurs when using symbolic computation, producing long formulas that, although correct, fail to simplify.

Since the independence could easily be detected, these two methods could explicitly return the exact expected value in that case.
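One way to do this is sketched below with a hypothetical `mutual_information_exact` helper on a dict-based joint pmf (not Lea's actual internal representation): test whether the joint pmf factorizes into its marginals using exact rational arithmetic, and short-circuit to 0.0 when it does.

```python
import math
from fractions import Fraction

def mutual_information_exact(joint):
    """Mutual information that returns exactly 0.0 for independent pmfs.

    `joint` maps (x, y) pairs to Fraction probabilities. This is a
    hypothetical sketch, not Lea's actual internal representation.
    """
    # Exact marginal distributions.
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, Fraction(0)) + p
        py[y] = py.get(y, Fraction(0)) + p
    # Independence test in exact arithmetic: p(x, y) == p(x) * p(y).
    if all(p == px[x] * py[y] for (x, y), p in joint.items()):
        return 0.0
    # Dependent case: fall back to the usual floating-point formula.
    return sum(float(p) * math.log2(float(p / (px[x] * py[y])))
               for (x, y), p in joint.items() if p)

# Two independent fair dice: p(i, j) = 1/36 for every pair.
dice = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}
print(mutual_information_exact(dice))  # -> 0.0
```

The same short-circuit would apply to `cond_entropy`: when the test succeeds, return the first variable's entropy unchanged.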

Comments (8)

  1. Pierre Denis reporter

    Correct get_alea_leaves_set (renamed get_leaves_set, now returning Alea, Olea, Plea), remove *ceiling arguments + misc refactoring (refs #64)

    → <<cset 02d603c748b0>>
