Files changed (15)

+4 -0 CHANGELOG.txt
+3 -3 README.md
+27 -12 code/ITE_install.m
+38 -0 code/estimators/base_estimators/IShannon_AP2_estimation.m
+31 -0 code/estimators/base_estimators/IShannon_AP2_initialization.m
+38 -0 code/estimators/base_estimators/IShannon_AP_estimation.m
+31 -0 code/estimators/base_estimators/IShannon_AP_initialization.m
+1 -0 code/estimators/base_estimators/Kexpected_estimation.m
+4 -2 code/estimators/quick_tests/tests_analytical_vs_estimation/quick_test_IShannon.m
+3 -0 code/estimators/quick_tests/tests_image_registration/quick_test_Iimreg.m
+10 -8 code/estimators/quick_tests/tests_other_consistency/quick_test_Iindependence.m
+674 -0 code/shared/embedded/MI_AP/LICENSE.txt
+206 -0 code/shared/embedded/MI_AP/mutin.cpp
+54 -0 code/shared/embedded/MI_AP/mutin.m
+67 -0 code/shared/embedded/MI_AP/muting.m
README.md
 `divergence (D)`: Kullback-Leibler divergence (relative entropy, I directed divergence), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance), J-distance (symmetrised Kullback-Leibler divergence, J divergence), Cauchy-Schwartz divergence, Euclidean distance based divergence, energy distance (specially the Cramer-Von Mises distance), Jensen-Shannon divergence, Jensen-Rényi divergence, K divergence, L divergence, f-divergence (Csiszár-Morimoto divergence, Ali-Silvey distance), non-symmetric Bregman distance (Bregman divergence), Jensen-Tsallis divergence, symmetric Bregman distance, Pearson chi square divergence (chi square distance), Sharma-Mittal divergence,
 `association measures (A)`, including `measures of concordance`: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient, grade correlation coefficient), correntropy, centered correntropy, correntropy coefficient, correntropy induced metric, centered correntropy induced metric, multivariate extension of Blomqvist's beta (medial correlation coefficient), multivariate conditional version of Spearman's rho, lower/upper tail dependence via conditional Spearman's rho,
- `kernels on distributions (K)`: expected kernel (summation kernel, mean map kernel), Bhattacharyya kernel, probability product kernel, Jensen-Shannon kernel, exponentiated Jensen-Shannon kernel, exponentiated Jensen-Renyi kernel(s), Jensen-Tsallis kernel, exponentiated Jensen-Tsallis kernel(s), and
+ `kernels on distributions (K)`: expected kernel (summation kernel, mean map kernel, set kernel, multi-instance kernel, ensemble kernel; special convolution kernel), Bhattacharyya kernel, probability product kernel, Jensen-Shannon kernel, exponentiated Jensen-Shannon kernel, exponentiated Jensen-Renyi kernel(s), Jensen-Tsallis kernel, exponentiated Jensen-Tsallis kernel(s), and
- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.58_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.58_code.tar.bz2),
+ code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.59_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.59_code.tar.bz2),
code/ITE_install.m
%compile (if you do not want to use the embedded C/C++ accelerations, set all the 'compile_...' variables to zero: compile_NCut = 0; compile_ANN = 0; ...):
- compile_TCA = 1;%1=compile, 0=do not compile (chol_gauss); not necessary, but can further speed up the computations; the package also contains the purely Matlab/Octave 'chol_gauss.m'
- compile_SWICA = 1; %1=compile, 0=do not compile; not necessary, but can accelerate computations; the package also contains the purely Matlab/Octave 'SW_kappa.m' and 'SW_sigma.m'
- compile_Hoeffding_term1 = 1; %1=compile, 0=do not compile; not necessary, but can be more economical in terms of memory used + accelerate computations; the package also contains the purely Matlab/Octave 'Hoeffding_term1.m'
- compile_Edgeworth_t1_t2_t3 = 1; %1=compile, 0=do not compile; not necessary, but can accelerate computations; the ITE package also contains the purely Matlab/Octave 'Edgeworth_t1_t2_t3.m'
- compile_CDSS = 1; %1=compile, 0=do not compile; not necessary, but can speed up the computations; the ITE package also contains the purely Matlab/Octave 'compute_CDSS.m'
+ compile_TCA = 1;%1=compile, 0=do not compile (chol_gauss); not necessary, but can further speed up the computations; the package also contains the purely Matlab/Octave 'chol_gauss.m'; -> KCCA/KGV estimator, incomplete Cholesky decomposition.
+ compile_SWICA = 1; %1=compile, 0=do not compile; not necessary, but can accelerate computations; the package also contains the purely Matlab/Octave 'SW_kappa.m' and 'SW_sigma.m'. -> Schweizer-Wolff's sigma and kappa estimation.
+ compile_Hoeffding_term1 = 1; %1=compile, 0=do not compile; not necessary, but can be more economical in terms of memory used + accelerate computations; the package also contains the purely Matlab/Octave 'Hoeffding_term1.m'; -> Hoeffding's Phi estimation.
+ compile_Edgeworth_t1_t2_t3 = 1; %1=compile, 0=do not compile; not necessary, but can accelerate computations; the ITE package also contains the purely Matlab/Octave 'Edgeworth_t1_t2_t3.m'; -> Edgeworth expansion based entropy estimation.
+ compile_CDSS = 1; %1=compile, 0=do not compile; not necessary, but can speed up the computations; the ITE package also contains the purely Matlab/Octave 'compute_CDSS.m'; -> a Renyi quadratic entropy estimator.
+ compile_KDP = 1; %1=compile, 0=do not compile; -> adaptive partitioning based Shannon entropy estimation.
+ compile_MI_AP = 1; %1=compile, 0=do not compile; not necessary, but can speed up the computations; the ITE package also contains the purely Matlab/Octave 'mutin.m'. -> mutual information estimation based on adaptive partitioning.
- download_MISR1 = 1;%1=download+extract,0=do not download, aerosol optical depth (AOD) prediction: MISR1 dataset
+ download_MISR1 = 1;%1=download+extract,0=do not download; -> aerosol optical depth (AOD) prediction: MISR1 dataset
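The headline addition compiled above is MI_AP, Darbellay-Vajda adaptive partitioning for mutual information. As a rough illustration of the underlying plug-in idea only (a fixed-grid histogram; the adaptive estimator instead refines cells where the sample is locally non-independent), a hypothetical Python sketch — the function name and parameters are illustrative, not ITE's API:

```python
import numpy as np

def mi_histogram(x, y, bins=8):
    """Plug-in Shannon mutual information estimate (in nats) from a fixed
    2-D histogram. The adaptive partitioning estimator follows the same
    plug-in idea but splits cells adaptively instead of using a fixed grid."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                      # joint cell probabilities
    px = pxy.sum(axis=1, keepdims=True)        # marginal of x
    py = pxy.sum(axis=0, keepdims=True)        # marginal of y
    mask = pxy > 0                             # skip empty cells (log 0)
    return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

rng = np.random.default_rng(0)
x = rng.standard_normal(20000)
y_indep = rng.standard_normal(20000)           # independent of x: MI near 0
y_dep = x + 0.1 * rng.standard_normal(20000)   # strongly dependent: MI large
```

In ITE the compiled mutin.cpp plays the role of the inner loop, with the pure Matlab/Octave 'mutin.m' as fallback, mirroring the other compile_* pairs above.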
code/estimators/base_estimators/IShannon_AP2_estimation.m
+%Estimates the Shannon mutual information (I) using adaptive partitioning; specialization to 2 components.
+%We use the naming convention 'I<name>_estimation' to ease embedding new mutual information estimation methods.
+% Georges A. Darbellay and Petr Tichavsky. Independent component analysis through direct estimation of the mutual information. In International Workshop on Independent Component Analysis and Blind Signal Separation, pages 69-74, 2000.
+% Georges A. Darbellay and Igor Vajda. Estimation of the information by an adaptive partitioning of the observation space. IEEE Transactions on Information Theory, 45:1315-1321, 1999.
+%Copyright (C) 2012-2014 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+%co.mult:OK. The information theoretical quantity of interest can be (and is!) estimated exactly [co.mult=1]; the computational complexity of the estimation is essentially the same as that of the 'up to multiplicative constant' case [co.mult=0]. In other words, the estimation is carried out 'exactly' (instead of up to 'proportionality').
code/estimators/base_estimators/IShannon_AP2_initialization.m
+%Initialization of the adaptive partitioning based Shannon mutual information estimator; specialization to 2 components.
+% 2)We use the naming convention 'I<name>_initialization' to ease embedding new mutual information estimation methods.
+% mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes (='exact' estimation), '=0' no (=estimation up to 'proportionality').
+% post_init: {field_name1,field_value1,field_name2,field_value2,...}; cell array containing the names and the values of the cost object fields that are to be used
+%Copyright (C) 2012-2014 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+%mandatory fields (following the template structure of the estimators to make uniform usage of the estimators possible):
code/estimators/base_estimators/IShannon_AP_estimation.m
+%We use the naming convention 'I<name>_estimation' to ease embedding new mutual information estimation methods.
+% Georges A. Darbellay and Petr Tichavsky. Independent component analysis through direct estimation of the mutual information. In International Workshop on Independent Component Analysis and Blind Signal Separation, pages 69-74, 2000.
+% Georges A. Darbellay and Igor Vajda. Estimation of the information by an adaptive partitioning of the observation space. IEEE Transactions on Information Theory, 45:1315-1321, 1999.
+%Copyright (C) 2012-2014 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+%co.mult:OK. The information theoretical quantity of interest can be (and is!) estimated exactly [co.mult=1]; the computational complexity of the estimation is essentially the same as that of the 'up to multiplicative constant' case [co.mult=0]. In other words, the estimation is carried out 'exactly' (instead of up to 'proportionality').
code/estimators/base_estimators/IShannon_AP_initialization.m
+% 2)We use the naming convention 'I<name>_initialization' to ease embedding new mutual information estimation methods.
+% mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes (='exact' estimation), '=0' no (=estimation up to 'proportionality').
+% post_init: {field_name1,field_value1,field_name2,field_value2,...}; cell array containing the names and the values of the cost object fields that are to be used
+%Copyright (C) 2012-2014 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+%mandatory fields (following the template structure of the estimators to make uniform usage of the estimators possible):
code/estimators/base_estimators/Kexpected_estimation.m
% Krikamol Muandet, Kenji Fukumizu, Francesco Dinuzzo, and Bernhard Scholkopf. Learning from distributions via support measure machines. In Advances in Neural Information Processing Systems (NIPS), pages 10-18, 2011.
% Alain Berlinet and Christine Thomas-Agnan. Reproducing Kernel Hilbert Spaces in Probability and Statistics. Kluwer, 2004. (mean embedding)
% Thomas Gartner, Peter A. Flach, Adam Kowalczyk, and Alexander Smola. Multi-instance kernels. In International Conference on Machine Learning (ICML), pages 179-186, 2002. (multi-instance/set/ensemble kernel)
+% David Haussler. Convolution kernels on discrete structures. Technical report, Department of Computer Science, University of California at Santa Cruz, 1999. (convolution kernel; special case -> set kernel)
%Copyright (C) 2012-2014 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
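The references above define the expected kernel as the mean of a base kernel over all sample pairs (equivalently, the inner product of the two mean embeddings). A minimal Python sketch with a Gaussian base kernel; the function and variable names are illustrative assumptions, not ITE's API:

```python
import numpy as np

def expected_kernel(X, Y, sigma=1.0):
    """Expected (mean map / set / multi-instance) kernel between two sample
    sets: the average of a Gaussian base kernel k(x, y) over all pairs."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)  # pairwise squared distances
    return float(np.exp(-d2 / (2.0 * sigma ** 2)).mean())

rng = np.random.default_rng(2)
A = rng.standard_normal((500, 2))         # sample from P
B = rng.standard_normal((500, 2))         # second sample from the same P
C = rng.standard_normal((500, 2)) + 5.0   # sample from a far-away Q
# samples from similar distributions should yield a larger kernel value
```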
code/estimators/quick_tests/tests_analytical_vs_estimation/quick_test_IShannon.m
ds = [2;2]; %subspace dimensions. ds(m) = dimension of the m^th subspace, m=1,...,M (M=length(ds)); M>=2
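quick_test_IShannon compares an estimated mutual information against a closed form. For a bivariate Gaussian with correlation rho, I(X;Y) = -0.5*log(1 - rho^2) in nats. A minimal Python analogue of that analytical-vs-estimated check, using a simple parametric plug-in via the sample correlation rather than ITE's adaptive partitioning estimator:

```python
import numpy as np

rho = 0.6
analytical = -0.5 * np.log(1.0 - rho ** 2)   # closed-form Gaussian MI (nats)

rng = np.random.default_rng(1)
cov = np.array([[1.0, rho], [rho, 1.0]])
sample = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)
r = np.corrcoef(sample[:, 0], sample[:, 1])[0, 1]
estimated = -0.5 * np.log(1.0 - r ** 2)      # plug in the sample correlation
# with this many samples the two values should agree closely
```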
CHANGELOG.txt