Files changed (15)

+8 -0 CHANGELOG.txt

+3 -3 README.md

+5 -0 code/H_I_D_A_C/meta_estimators/DEnergyDist_DMMD_estimation.m

+5 -0 code/H_I_D_A_C/meta_estimators/DJdistance_estimation.m

+5 -0 code/H_I_D_A_C/meta_estimators/DJensenRenyi_HRenyi_estimation.m

+5 -0 code/H_I_D_A_C/meta_estimators/DJensenShannon_HShannon_estimation.m

+5 -0 code/H_I_D_A_C/meta_estimators/DKL_CCE_HShannon_estimation.m

+46 -0 code/H_I_D_A_C/meta_estimators/DK_DKL_estimation.m

+32 -0 code/H_I_D_A_C/meta_estimators/DK_DKL_initialization.m

+46 -0 code/H_I_D_A_C/meta_estimators/DL_DKL_estimation.m

+32 -0 code/H_I_D_A_C/meta_estimators/DL_DKL_initialization.m

+5 -0 code/H_I_D_A_C/meta_estimators/ITsallis_DTsallis_estimation.m

+14 -5 code/H_I_D_A_C/utilities/estimate_Dtemp1.m

+13 -5 code/H_I_D_A_C/utilities/estimate_Dtemp2.m

+10 -4 code/H_I_D_A_C/utilities/kNN_squared_distances.m
CHANGELOG.txt
+(i) meta estimators, see 'ITsallis_DTsallis_estimation.m', 'DEnergyDist_DMMD_estimation.m', 'DJdistance_estimation.m', 'DJensenRenyi_HRenyi_estimation.m', 'DJensenShannon_HShannon_estimation.m', 'DKL_CCE_HShannon_estimation.m'.
Jensen-Renyi divergence estimation: added; see 'DJensenRenyi_HRenyi_initialization.m' and 'DJensenRenyi_HRenyi_estimation.m'.
Jensen-Shannon divergence estimation: added; see 'DJensenShannon_HShannon_initialization.m' and 'DJensenShannon_HShannon_estimation.m'.
README.md
 `entropy (H)`: Shannon entropy, Rényi entropy, Tsallis entropy (Havrda and Charvát entropy), complex entropy,
 `mutual information (I)`: generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon mutual information, L2 mutual information, Rényi mutual information, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa, complex mutual information, Cauchy-Schwartz quadratic mutual information, Euclidean distance based quadratic mutual information, distance covariance, distance correlation, approximate correntropy independence measure,
- `divergence (D)`: Kullback-Leibler divergence (relative entropy; I directed divergence), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance, an integral probability metric), J-distance (symmetrised Kullback-Leibler divergence), Cauchy-Schwartz divergence, Euclidean distance based divergence, energy distance (specially the Cramer-Von Mises distance), Jensen-Shannon divergence, Jensen-Rényi divergence,
+ `divergence (D)`: Kullback-Leibler divergence (relative entropy, I directed divergence), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance, an integral probability metric), J-distance (symmetrised Kullback-Leibler divergence, J divergence), Cauchy-Schwartz divergence, Euclidean distance based divergence, energy distance (specially the Cramer-Von Mises distance), Jensen-Shannon divergence, Jensen-Rényi divergence, K divergence, L divergence,
 `association measures (A)`, including `measures of concordance`: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient, grade correlation coefficient), correntropy, centered correntropy, correntropy coefficient, correntropy induced metric, centered correntropy induced metric, multivariate extension of Blomqvist's beta (medial correlation coefficient), multivariate conditional version of Spearman's rho, lower/upper tail dependence via conditional Spearman's rho,
- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.36_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.36_code.tar.bz2),
+ code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.37_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.37_code.tar.bz2),
code/H_I_D_A_C/meta_estimators/DJensenRenyi_HRenyi_estimation.m
D_JR = H_estimation(mixtureY,co.member_co) - (w(1)*H_estimation(Y1,co.member_co) + w(2)*H_estimation(Y2,co.member_co));
code/H_I_D_A_C/meta_estimators/DJensenShannon_HShannon_estimation.m
D_JS = H_estimation(mixtureY,co.member_co) - (w(1)*H_estimation(Y1,co.member_co) + w(2)*H_estimation(Y2,co.member_co));
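The Jensen-Shannon meta estimator above obtains the divergence purely from Shannon entropy estimates of the mixture and the members. A minimal discrete-distribution sketch of the same identity, D_JS = H(w1·p + w2·q) - (w1·H(p) + w2·H(q)) (Python for illustration only; this is not part of the MATLAB toolbox, and all names are mine):

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in nats) of a discrete distribution p."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def jensen_shannon(p, q, w=(0.5, 0.5)):
    """D_JS = H(w1*p + w2*q) - (w1*H(p) + w2*H(q))."""
    mix = [w[0] * pi + w[1] * qi for pi, qi in zip(p, q)]
    return shannon_entropy(mix) - (w[0] * shannon_entropy(p) + w[1] * shannon_entropy(q))

p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
d = jensen_shannon(p, q)
```

With equal weights the quantity is symmetric, non-negative, and bounded by log 2; the toolbox applies the same decomposition with k-NN entropy estimators in place of the closed-form discrete entropy.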
code/H_I_D_A_C/meta_estimators/DK_DKL_estimation.m
+%Estimates the K divergence of Y1 and Y2 using the relation: D_K(f_1,f_2) = D(f_1,(f_1+f_2)/2), where D denotes the Kullback-Leibler divergence.
+% 1)We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+% Jianhua Lin. Divergence measures based on the Shannon entropy. IEEE Transactions on Information Theory, 37:145-151, 1991.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
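The header comment states the relation the new meta estimator relies on: the K divergence is a Kullback-Leibler divergence against the mixture (f_1+f_2)/2. A discrete-distribution sketch of that relation (Python, illustrative only; names are mine, not toolbox identifiers):

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D(p||q) in nats, discrete case."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def k_divergence(p, q):
    """D_K(p, q) = D(p || (p+q)/2), per Lin (1991)."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return kl(p, m)

p = [0.7, 0.2, 0.1]
q = [0.1, 0.5, 0.4]
dk = k_divergence(p, q)
```

Unlike the L divergence below, D_K is asymmetric in its arguments, and it is bounded above by log 2 because the mixture never assigns less than half of p's mass anywhere.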
code/H_I_D_A_C/meta_estimators/DK_DKL_initialization.m
+%Initialization of the K divergence estimator, defined according to the relation: D_K(f_1,f_2) = D(f_1,(f_1+f_2)/2), where D denotes the Kullback-Leibler divergence.
+% 2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
code/H_I_D_A_C/meta_estimators/DL_DKL_estimation.m
+%Estimates the L divergence of Y1 and Y2 using the relation: D_L(f_1,f_2) = D(f_1,(f_1+f_2)/2) + D(f_2,(f_1+f_2)/2), where D denotes the Kullback-Leibler divergence.
+% 1)We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+% Jianhua Lin. Divergence measures based on the Shannon entropy. IEEE Transactions on Information Theory, 37:145-151, 1991.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
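The L divergence symmetrises the construction: both densities are compared to their equal-weight mixture. A discrete sketch of the stated relation (Python, illustrative only; not toolbox code):

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D(p||q) in nats, discrete case."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def l_divergence(p, q):
    """D_L(p, q) = D(p||m) + D(q||m), with m = (p+q)/2 (Lin, 1991)."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return kl(p, m) + kl(q, m)

p = [0.7, 0.2, 0.1]
q = [0.1, 0.5, 0.4]
dl = l_divergence(p, q)
```

D_L is symmetric, bounded by 2·log 2, and equals twice the Jensen-Shannon divergence with equal weights, which is why a single underlying KL estimator suffices for both meta estimators.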
code/H_I_D_A_C/meta_estimators/DL_DKL_initialization.m
+%Initialization of the L divergence estimator, defined according to the relation: D_L(f_1,f_2) = D(f_1,(f_1+f_2)/2) + D(f_2,(f_1+f_2)/2), where D denotes the Kullback-Leibler divergence.
+% 2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
code/H_I_D_A_C/utilities/estimate_Dtemp1.m
-%Estimates Dtemp1 = \int p^{\alpha}(x)q^{1-\alpha}(x)dx; the Renyi and the Tsallis divergences are simple functions of this quantity.
+%Estimates Dtemp1 = \int p^{\alpha}(u)q^{1-\alpha}(u)du; the Renyi and the Tsallis divergences are simple functions of this quantity.
%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
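The comment's claim that Renyi and Tsallis divergences are "simple functions" of Dtemp1 is concrete: D_R = log(Dtemp1)/(α-1) and D_T = (Dtemp1 - 1)/(α-1). A discrete-distribution sketch (Python, illustrative; not the toolbox's k-NN estimator, and the function names are mine):

```python
import math

def dtemp1(p, q, alpha):
    """Discrete analogue of Dtemp1 = \\int p^alpha(u) q^(1-alpha)(u) du."""
    return sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q))

def renyi_divergence(p, q, alpha):
    """D_R(p||q) = log(Dtemp1) / (alpha - 1), alpha != 1."""
    return math.log(dtemp1(p, q, alpha)) / (alpha - 1)

def tsallis_divergence(p, q, alpha):
    """D_T(p||q) = (Dtemp1 - 1) / (alpha - 1), alpha != 1."""
    return (dtemp1(p, q, alpha) - 1) / (alpha - 1)

p = [0.7, 0.2, 0.1]
q = [0.1, 0.5, 0.4]
```

Both vanish when p = q (Dtemp1 = 1), and both recover the Kullback-Leibler divergence in the limit α → 1.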
code/H_I_D_A_C/utilities/estimate_Dtemp2.m
-%Estimates Dtemp2 = \int p^a(x)q^b(x)dx; the Hellinger distance and the Bhattacharyya distance are simple functions of this quantity.
+%Estimates Dtemp2 = \int p^a(u)q^b(u)p(u)du; the Hellinger distance and the Bhattacharyya distance are simple functions of this quantity.
%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
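For the Hellinger and Bhattacharyya distances the relevant special case of this integral is the Bhattacharyya coefficient BC = \int sqrt(p q). A discrete sketch of how both distances follow from it (Python, illustrative only; the toolbox's k-NN estimator uses the p(u)-weighted form above, which this plain discrete analogue sidesteps):

```python
import math

def bhattacharyya_coefficient(p, q):
    """BC = sum_i sqrt(p_i * q_i), the a = b = 1/2 case of the integral."""
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

def hellinger_distance(p, q):
    """H(p, q) = sqrt(1 - BC); max(0, .) guards against rounding when p = q."""
    return math.sqrt(max(0.0, 1 - bhattacharyya_coefficient(p, q)))

def bhattacharyya_distance(p, q):
    """D_B(p, q) = -log(BC)."""
    return -math.log(bhattacharyya_coefficient(p, q))

p = [0.7, 0.2, 0.1]
q = [0.1, 0.5, 0.4]
```

BC = 1 exactly when p = q, so both distances vanish there; the Hellinger distance stays in [0, 1] while the Bhattacharyya distance is unbounded.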
code/H_I_D_A_C/utilities/kNN_squared_distances.m
[squared_distances,indices] = knn(Q, Y, max(co.k)+1);%assumption below:max(co.k)+1 <= size(Q,1)[=size(Y,1)]
- squared_distances = (distances.').^2;%distances -> squared distances; .': to be compatible with 'ANN'
+ squared_distances = (distances(:,2:end).').^2;%distances -> squared distances; .': to be compatible with 'ANN'
+ squared_distances = (distances.').^2;%distances -> squared distances; .': to be compatible with 'ANN'
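The change in this hunk, together with the `max(co.k)+1` query above it, handles the self-neighbor case: when the sample is queried against itself, each point's nearest neighbor is itself at distance zero, so one extra neighbor is requested and the first column is dropped. A brute-force sketch of the same idea (Python, illustrative; `knn_squared_distances` and its signature are my invention, not the toolbox API):

```python
def knn_squared_distances(Y, Q, k, Y_equals_Q):
    """Squared distances from each query point in Q to its k nearest points in Y.
    When Y and Q are the same sample (Y_equals_Q), query k+1 neighbors and drop
    the first, which is the zero self-distance -- mirroring distances(:,2:end)."""
    def sqdist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    out = []
    for q in Q:
        d = sorted(sqdist(q, y) for y in Y)
        out.append(d[1:k + 1] if Y_equals_Q else d[:k])
    return out

Y = [[0, 0], [1, 0], [3, 0]]
```

Without the drop, every k-NN density estimate built on these distances would see a spurious zero distance and diverge, which is what the patched branch prevents.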