cb07f5b
committed
Files changed (7)

+2 -0  CHANGELOG.txt
+1 -1  README.md
+38 -0  code/H_I_D/base_estimators/DKL_kNN_k_estimation.m
+51 -0  code/H_I_D/base_estimators/DKL_kNN_k_initialization.m
+43 -0  code/H_I_D/base_estimators/DKL_kNN_kiTi_estimation.m
+47 -0  code/H_I_D/base_estimators/DKL_kNN_kiTi_initialization.m
+0 -0  doc/ITE_documentation.pdf
CHANGELOG.txt
+k-nearest neighbor based Kullback-Leibler divergence estimators: added. See 'DKL_kNN_k_initialization.m', 'DKL_kNN_k_estimation.m', 'DKL_kNN_kiTi_initialization.m', 'DKL_kNN_kiTi_estimation.m'.
README.md
-ITE can estimate Shannon, Rényi, Tsallis entropy; generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon, L2, Rényi, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa; complex variants of entropy and mutual information; L2, Rényi, Tsallis divergence; Hellinger, Bhattacharyya distance; maximum mean discrepancy, and J-distance.
+ITE can estimate Shannon, Rényi, Tsallis entropy; generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon, L2, Rényi, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa; complex variants of entropy and mutual information; L2, Rényi, Tsallis, Kullback-Leibler divergence; Hellinger, Bhattacharyya distance; maximum mean discrepancy, and J-distance.
code/H_I_D/base_estimators/DKL_kNN_k_estimation.m
+%using the kNN method (S={k}). The number of samples in X [=size(X,2)] and Y [=size(Y,2)] can be different. Cost parameters are provided in the cost object co.
+%We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
+% Fernando Perez-Cruz. Estimation of Information Theoretic Measures for Continuous Random Variables. Advances in Neural Information Processing Systems (NIPS), pp. 1257-1264, 2008.
+% Nikolai Leonenko, Luc Pronzato, and Vippal Savani. A class of Rényi information estimators for multidimensional densities. Annals of Statistics, 36(5):2153-2182, 2008.
+% Qing Wang, Sanjeev R. Kulkarni, and Sergio Verdú. Divergence estimation for multidimensional densities via k-nearest-neighbor distances. IEEE Transactions on Information Theory, 55:2392-2405, 2009.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
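The estimator referenced above (Wang, Kulkarni, Verdú, 2009) has a compact closed form. A minimal Python sketch, not the toolbox's MATLAB implementation, assuming a fixed k, Euclidean distances, and samples stored row-wise (the toolbox stores them column-wise): D̂(p‖q) = (d/n)·Σ_i log(ν_k(i)/ρ_k(i)) + log(m/(n−1)), where ρ_k(i) is the distance from x_i to its k-th nearest neighbor among the other x samples and ν_k(i) the distance to its k-th nearest neighbor among the y samples.

```python
import numpy as np
from scipy.spatial import cKDTree

def kl_knn_k(x, y, k=3):
    """kNN Kullback-Leibler divergence estimate D(p||q).

    x: (n, d) samples from p; y: (m, d) samples from q; fixed k >= 2.
    Sketch of the Wang-Kulkarni-Verdu (2009) estimator:
        (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)).
    """
    x, y = np.atleast_2d(x), np.atleast_2d(y)
    n, d = x.shape
    m = y.shape[0]
    # k-th NN distance within x; query with k+1 so the point itself is skipped
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # k-th NN distance from each x_i into the y sample
    nu = cKDTree(y).query(x, k=k)[0][:, -1]
    return d / n * np.sum(np.log(nu / rho)) + np.log(m / (n - 1))
```

As a sanity check, for two univariate Gaussians N(0,1) and N(1,1) the true divergence is 0.5, and the estimate approaches it as the sample sizes grow.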
code/H_I_D/base_estimators/DKL_kNN_k_initialization.m
+%Initialization of the kNN (k-nearest neighbor, S={k}) based Kullback-Leibler divergence estimator.
+% 2)We make use of the naming convention 'D<name>_initialization', to ease embedding new divergence estimation methods.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+ %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
+ %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, where the returned (non-squared) neighbor distances exceed the true k-nearest-neighbor distances by at most a factor of (1+epsi).
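The (1+epsi) contract described for co.epsi is the standard approximate-kNN guarantee; SciPy's cKDTree exposes the same knob through the eps parameter of query. A small sketch (SciPy stands in for the MATLAB 'knnsearch'/ANN back-ends the toolbox actually uses):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
pts = rng.random((2000, 3))
queries = rng.random((100, 3))
tree = cKDTree(pts)

d_exact = tree.query(queries, k=5, eps=0.0)[0]  # exact kNN distances
d_aprx = tree.query(queries, k=5, eps=0.5)[0]   # approximate kNN, eps = 0.5

# Approximate distances are never smaller than the exact ones, and the
# k-th returned distance stays within a factor of (1 + eps) of the true one.
print(np.all(d_aprx >= d_exact - 1e-12))
print(np.all(d_aprx[:, -1] <= 1.5 * d_exact[:, -1] + 1e-12))
```

Larger eps trades accuracy for speed: the tree search can prune more aggressively, which is why the estimator comments present epsi as a tunable cost parameter.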
code/H_I_D/base_estimators/DKL_kNN_kiTi_estimation.m
+%using the kNN method (S={k}). The number of samples in X [=size(X,2)] and Y [=size(Y,2)] can be different. Cost parameters are provided in the cost object co.
+%We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
+% Qing Wang, Sanjeev R. Kulkarni, and Sergio Verdú. Divergence estimation for multidimensional densities via k-nearest-neighbor distances. IEEE Transactions on Information Theory, 55:2392-2405, 2009.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
code/H_I_D/base_estimators/DKL_kNN_kiTi_initialization.m
+%Initialization of the kNN (k-nearest neighbor, S_1={k_1}, S_2={k_2}) based Kullback-Leibler divergence estimator. Here, the k_i values depend on the number of samples; they are set in 'DKL_kNN_kiTi_estimation.m'.
+% 2)We make use of the naming convention 'D<name>_initialization', to ease embedding new divergence estimation methods.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+ %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
+ %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, where the returned (non-squared) neighbor distances exceed the true k-nearest-neighbor distances by at most a factor of (1+epsi).
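In the kiTi variant the neighbor counts k_i grow with the sample sizes T_i instead of being fixed. The exact rule lives in 'DKL_kNN_kiTi_estimation.m'; a hypothetical rule-of-thumb consistent with the consistency conditions in Wang et al. (k → ∞ while k/T → 0) is k_i ≈ √T_i:

```python
import math

def choose_k(T):
    """Sample-size dependent neighbor count, k = ceil(sqrt(T)).

    Hypothetical rule-of-thumb: it satisfies k -> inf and k/T -> 0 as
    T grows, but the rule actually coded in DKL_kNN_kiTi_estimation.m
    may differ.
    """
    return max(1, math.ceil(math.sqrt(T)))
```

With separate k_1 = choose_k(T_1) and k_2 = choose_k(T_2), the two kNN queries of the estimator can each use a neighbor count matched to its own sample.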
doc/ITE_documentation.pdf
Binary file modified.