Files changed (1)
+ITE is capable of estimating many different variants of entropy, mutual information, and divergence measures.
+ITE can estimate Shannon- and Rényi entropy; generalized variance, kernel canonical correlation analysis, kernel generalized variance, the Hilbert-Schmidt independence criterion, Shannon-, L2-, Rényi-, and Tsallis mutual information, and copula-based kernel dependency; complex variants of entropy and mutual information; and L2-, Rényi-, and Tsallis divergence, maximum mean discrepancy, and the J-distance.
+Thanks to its highly modular design, ITE additionally supports Independent Subspace Analysis (ISA) and its extensions to linear, controlled, post-nonlinear, complex-valued, and partially observed systems, as well as to systems with nonparametric source dynamics.
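To make the capability list above concrete, here is a minimal sketch of the kind of nonparametric estimator such a toolbox builds on: a Kozachenko-Leonenko k-nearest-neighbour estimate of Shannon entropy. This is not ITE's API; the function name and parameters are hypothetical, and only NumPy/SciPy are used.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def shannon_entropy_knn(x, k=3):
    """Kozachenko-Leonenko kNN estimate of Shannon entropy, in nats.

    x: (n, d) array of i.i.d. samples. Hypothetical helper, not ITE's API.
    """
    n, d = x.shape
    tree = cKDTree(x)
    # Distance to the k-th nearest neighbour; k+1 because the query
    # point itself is returned at distance 0.
    dist, _ = tree.query(x, k=k + 1)
    eps = dist[:, -1]
    # Log-volume of the unit d-ball: pi^(d/2) / Gamma(d/2 + 1).
    log_c_d = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))

rng = np.random.default_rng(0)
h = shannon_entropy_knn(rng.standard_normal((2000, 1)))
# For comparison: the true entropy of N(0, 1) is 0.5*log(2*pi*e) ≈ 1.419 nats.
```

ITE packages many estimators of this flavour (plug-in, kNN-, and kernel-based) behind a uniform interface, which is what makes the combinations listed above possible.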