Commits
Comments (0)
Files changed (12)

+7 -0 CHANGELOG.txt

+5 -5 README.md

+2 -1 code/ITE_install.m

+38 -0 code/estimators/meta_estimators/Df_DChiSquare_estimation.m

+45 -0 code/estimators/meta_estimators/Df_DChiSquare_initialization.m

+34 -0 code/estimators/meta_estimators/IShannon_DKL_estimation.m

+45 -0 code/estimators/meta_estimators/IShannon_DKL_initialization.m

+2 -1 code/estimators/quick_tests/quick_test_Dequality.m

+0 -1 code/estimators/quick_tests/quick_test_HShannon.m

+2 -1 code/estimators/quick_tests/quick_test_IShannon.m

+1 -0 code/estimators/quick_tests/quick_test_Iimreg.m

+2 -1 code/estimators/quick_tests/quick_test_Iindependence.m
README.md
 `entropy (H)`: Shannon entropy, Rényi entropy, Tsallis entropy (Havrda and Charvát entropy), complex entropy, Phi-entropy (f-entropy), Sharma-Mittal entropy,
 `mutual information (I)`: generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon mutual information (total correlation, multi-information), L2 mutual information, Rényi mutual information, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa, complex mutual information, Cauchy-Schwartz quadratic mutual information, Euclidean distance based quadratic mutual information, distance covariance, distance correlation, approximate correntropy independence measure, chi-square mutual information (Hilbert-Schmidt norm of the normalized cross-covariance operator, squared-loss mutual information, mean square contingency),
 `divergence (D)`: Kullback-Leibler divergence (relative entropy, I directed divergence), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance), J-distance (symmetrised Kullback-Leibler divergence, J divergence), Cauchy-Schwartz divergence, Euclidean distance based divergence, energy distance (specially the Cramer-Von Mises distance), Jensen-Shannon divergence, Jensen-Rényi divergence, K divergence, L divergence, certain f-divergences (Csiszár-Morimoto divergence, Ali-Silvey distance), nonsymmetric Bregman distance (Bregman divergence), Jensen-Tsallis divergence, symmetric Bregman distance, Pearson chi square divergence (chi square distance), Sharma-Mittal divergence,
+ `divergence (D)`: Kullback-Leibler divergence (relative entropy, I directed divergence), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance), J-distance (symmetrised Kullback-Leibler divergence, J divergence), Cauchy-Schwartz divergence, Euclidean distance based divergence, energy distance (specially the Cramer-Von Mises distance), Jensen-Shannon divergence, Jensen-Rényi divergence, K divergence, L divergence, f-divergence (Csiszár-Morimoto divergence, Ali-Silvey distance), nonsymmetric Bregman distance (Bregman divergence), Jensen-Tsallis divergence, symmetric Bregman distance, Pearson chi square divergence (chi square distance), Sharma-Mittal divergence,
 `association measures (A)`, including `measures of concordance`: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient, grade correlation coefficient), correntropy, centered correntropy, correntropy coefficient, correntropy induced metric, centered correntropy induced metric, multivariate extension of Blomqvist's beta (medial correlation coefficient), multivariate conditional version of Spearman's rho, lower/upper tail dependence via conditional Spearman's rho,
 `kernels on distributions (K)`: expected kernel, Bhattacharyya kernel, probability product kernel, Jensen-Shannon kernel, exponentiated Jensen-Shannon kernel, exponentiated Jensen-Renyi kernel(s), Jensen-Tsallis kernel, exponentiated Jensen-Tsallis kernel(s), and
+ `kernels on distributions (K)`: expected kernel (summation kernel, mean map kernel), Bhattacharyya kernel, probability product kernel, Jensen-Shannon kernel, exponentiated Jensen-Shannon kernel, exponentiated Jensen-Renyi kernel(s), Jensen-Tsallis kernel, exponentiated Jensen-Tsallis kernel(s), and
**Follow ITE**: on [Bitbucket](https://bitbucket.org/szzoli/ite/follow), on [Twitter](https://twitter.com/ITEtoolbox) to be always up-to-date.
+**Follow ITE**: on [Bitbucket](https://bitbucket.org/szzoli/ite/follow), on [Twitter](https://twitter.com/ITEtoolbox).
 code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE-0.52_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE-0.52_code.tar.bz2),
+ code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE-0.53_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE-0.53_code.tar.bz2),
code/ITE_install.m
code/estimators/meta_estimators/Df_DChiSquare_estimation.m
+%Estimates the f-divergence (D) of Y1 and Y2 using second-order Taylor expansion of f and Pearson chi square divergence.
+% 1)We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+% Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+% Frank Nielsen and Richard Nock. On the chi square and higher-order chi distances for approximating f-divergences. IEEE Signal Processing Letters, 21:10-13, 2014.
+% Neil S. Barnett, Pietro Cerone, Sever Silvestru Dragomir, and A. Sofo. Approximating Csiszar f-divergence by the use of Taylor's formula with integral remainder. Mathematical Inequalities and Applications, 5:417-432, 2002.
+%Copyright (C) 2012-2014 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+%co.mult:OK. The information theoretical quantity of interest can be (and is!) estimated exactly [co.mult=1]; the computational complexity of the estimation is essentially the same as that of the 'up to multiplicative constant' case [co.mult=0]. In other words, the estimation is carried out 'exactly' (instead of up to 'proportionality').
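The meta-estimator above rests on the relation cited from Nielsen and Nock: for a twice-differentiable generator f with f(1) = 0 and f'(1) = 0, the second-order Taylor expansion of f around 1 gives D_f(p, q) ≈ (f''(1)/2) · χ²(p, q), so any Pearson chi square divergence estimator yields an approximate f-divergence estimator. A minimal sketch of this idea on discrete distributions (in Python for illustration; ITE itself is MATLAB, and all names below are my own, not ITE's):

```python
import numpy as np

def chi_square(p, q):
    """Pearson chi-square divergence: chi^2(p, q) = sum_i (p_i - q_i)^2 / q_i."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum((p - q) ** 2 / q)

def f_divergence_exact(p, q, f):
    """Exact discrete f-divergence: D_f(p, q) = sum_i q_i * f(p_i / q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(q * f(p / q))

def f_divergence_taylor(p, q, f_pp_at_1):
    """Second-order Taylor approximation D_f(p, q) ~ f''(1)/2 * chi^2(p, q),
    valid for generators f with f(1) = 0 and f'(1) = 0."""
    return 0.5 * f_pp_at_1 * chi_square(p, q)

# Kullback-Leibler as an f-divergence: f(t) = t*log(t) - t + 1,
# normalized so that f(1) = 0, f'(1) = 0, and f''(1) = 1.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.35, 0.25])
kl_exact = f_divergence_exact(p, q, lambda t: t * np.log(t) - t + 1)
kl_approx = f_divergence_taylor(p, q, f_pp_at_1=1.0)
```

For nearby distributions the chi-square proxy is close to the exact value, which is exactly the regime the cited approximation result addresses.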
code/estimators/meta_estimators/Df_DChiSquare_initialization.m
+%Initialization of the second-order Taylor expansion and Pearson chi square divergence based f-divergence estimator.
+% 2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
+% mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes (='exact' estimation), '=0' no (=estimation up to 'proportionality').
+% post_init: {field_name1,field_value1,field_name2,field_value2,...}; cell array containing the names and the values of the cost object fields that are to be used
+%Copyright (C) 2012-2014 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+%mandatory fields (following the template structure of the estimators to make uniform usage of the estimators possible):
+ co.member_name = 'ChiSquare_kNN_k'; %you can change it to any Pearson chi square divergence estimator
code/estimators/meta_estimators/IShannon_DKL_estimation.m
+%Estimates Shannon mutual information (I) based on Kullback-Leibler divergence. The estimation is carried out according to the relation: I(y^1,...,y^M) = D(f_y,\prod_{m=1}^M f_{y^m}).
+% 1)We use the naming convention 'I<name>_estimation' to ease embedding new mutual information estimation methods.
+%Copyright (C) 2012-2014 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+%co.mult:OK. The information theoretical quantity of interest can be (and is!) estimated exactly [co.mult=1]; the computational complexity of the estimation is essentially the same as that of the 'up to multiplicative constant' case [co.mult=0]. In other words, the estimation is carried out 'exactly' (instead of up to 'proportionality').
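The relation this meta-estimator uses, I(y^1,...,y^M) = D(f_y, \prod_{m=1}^M f_{y^m}), says that Shannon mutual information is the Kullback-Leibler divergence between the joint density and the product of the marginals. A minimal sketch for two discrete variables (Python for illustration; the function name is mine, not ITE's):

```python
import numpy as np

def shannon_mi_via_kl(p_joint):
    """Shannon mutual information of a discrete joint pmf p(x, y), computed as
    the KL divergence between the joint and the product of its marginals:
    I(X; Y) = D_KL( p(x, y) || p(x) * p(y) )."""
    p_joint = np.asarray(p_joint, float)
    px = p_joint.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    py = p_joint.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    prod = px * py                            # independence model p(x)p(y)
    mask = p_joint > 0                        # 0 * log(0/..) contributes 0
    return np.sum(p_joint[mask] * np.log(p_joint[mask] / prod[mask]))

# Perfectly dependent pair: X = Y uniform on {0, 1}, so I(X; Y) = H(X) = log 2.
p = np.array([[0.5, 0.0],
              [0.0, 0.5]])
mi = shannon_mi_via_kl(p)   # log(2), about 0.6931 nats
```

If the joint factorizes exactly (p(x, y) = p(x)p(y)), the KL divergence, and hence the mutual information, is zero; the ITE meta-estimator applies the same identity with a continuous KL estimator plugged in.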
code/estimators/meta_estimators/IShannon_DKL_initialization.m
+%Initialization of the "meta" Shannon mutual information estimator based on KullbackLeibler divergence.
+%Mutual information is estimated using the relation: I(y^1,...,y^M) = D(f_y,\prod_{m=1}^M f_{y^m}).
+% 2)We use the naming convention 'I<name>_initialization' to ease embedding new mutual information estimation methods.
+% mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes (='exact' estimation), '=0' no (=estimation up to 'proportionality').
+% post_init: {field_name1,field_value1,field_name2,field_value2,...}; cell array containing the names and the values of the cost object fields that are to be used
+%Copyright (C) 2012-2014 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+%mandatory fields (following the template structure of the estimators to make uniform usage of the estimators possible):
code/estimators/quick_tests/quick_test_Dequality.m
code/estimators/quick_tests/quick_test_IShannon.m
 ds = [1;1]; %subspace dimensions. ds(m) = dimension of the m^th subspace, m=1,...,M (M=length(ds)); M>=2
+ ds = [2;2]; %subspace dimensions. ds(m) = dimension of the m^th subspace, m=1,...,M (M=length(ds)); M>=2
CHANGELOG.txt