Please add information-theoretic functions

Issue #1041 new
Former user created an issue

Some functions are widely used in information theory but are cumbersome to calculate manually, so built-in functions could be quite useful. In particular, the following functions could be implemented:

- ENTROPY(p1,p2,...,pn) returns the entropy of a probability distribution (optionally, the logarithm base may be specified)
- KL(p1,p2,...,pn; q1,q2,...,qn) returns the Kullback-Leibler divergence of the two distributions
- CROSSENTROPY(p1,p2,...,pn; q1,q2,...,qn) returns the cross-entropy of the two distributions
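For reference, a minimal Python sketch of the semantics these functions would presumably have. The default logarithm base of 2 and the treatment of zero probabilities (taking 0 · log 0 = 0) are assumptions on my part; the request does not pin them down.

```python
import math

def entropy(ps, base=2):
    """ENTROPY: Shannon entropy H(p) = -sum(p_i * log_base(p_i)).
    Terms with p_i == 0 are skipped (convention: 0 * log 0 = 0)."""
    return -sum(p * math.log(p, base) for p in ps if p > 0)

def cross_entropy(ps, qs, base=2):
    """CROSSENTROPY: H(p, q) = -sum(p_i * log_base(q_i))."""
    return -sum(p * math.log(q, base) for p, q in zip(ps, qs) if p > 0)

def kl_divergence(ps, qs, base=2):
    """KL: D(p || q) = sum(p_i * log_base(p_i / q_i)),
    computed here as H(p, q) - H(p)."""
    return cross_entropy(ps, qs, base) - entropy(ps, base)

# Examples (base 2, i.e. results in bits):
# entropy([0.5, 0.5])                      -> 1.0
# kl_divergence([0.5, 0.5], [0.75, 0.25])  -> ~0.2075
```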

Other functions may be useful as well, such as JOINT_ENTROPY, CONDITIONAL_ENTROPY, or MUTUAL_INFORMATION, but they require a joint probability distribution, which may be cumbersome to represent in the calculator (see the sketch below).
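To illustrate the representation issue, here is a hedged sketch in which the joint distribution is passed as a matrix, with rows indexing X and columns indexing Y. The matrix convention and the function name are assumptions chosen for the example, not an existing calculator feature.

```python
import math

def mutual_information(joint, base=2):
    """MUTUAL_INFORMATION: I(X; Y) = sum over x, y of
    p(x, y) * log_base(p(x, y) / (p(x) * p(y))).
    `joint` is a 2-D list with joint[i][j] = p(X = x_i, Y = y_j)."""
    px = [sum(row) for row in joint]        # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]  # marginal of Y (column sums)
    return sum(
        pxy * math.log(pxy / (px[i] * py[j]), base)
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# Example: two perfectly correlated fair bits share 1 bit of information.
# mutual_information([[0.5, 0.0], [0.0, 0.5]])  -> 1.0
```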
