107f79f
Files changed (9)

+4 -0  CHANGELOG.txt
+3 -3  README.md
+41 -0  code/estimators/meta_estimators/KEJR1_HR_estimation.m
+47 -0  code/estimators/meta_estimators/KEJR1_HR_initialization.m
+37 -0  code/estimators/meta_estimators/KEJR2_DJR_estimation.m
+47 -0  code/estimators/meta_estimators/KEJR2_DJR_initialization.m
+66 -0  code/estimators/quick_tests/quick_test_KEJR1.m
+70 -0  code/estimators/quick_tests/quick_test_KEJR2.m
+2 -0  code/estimators/quick_tests/quick_test_Kpos_semidef.m
CHANGELOG.txt
+Exponentiated Jensen-Rényi kernel estimator based on the Rényi entropy and the Jensen-Rényi divergence: added; see 'KEJR1_HR_initialization.m', 'KEJR1_HR_estimation.m', 'KEJR2_DJR_initialization.m', 'KEJR2_DJR_estimation.m'.
+Quick tests for the exponentiated Jensen-Rényi kernel estimators: added; see 'quick_test_KEJR1.m', 'quick_test_KEJR2.m'. 'quick_test_Kpos_semidef.m': changed to cover the 2 new distribution kernel estimators.
Exponentiated Jensen-Shannon kernel estimation: added; see 'KEJS_DJS_initialization.m', 'KEJS_DJS_estimation.m'.
README.md
 `divergence (D)`: Kullback-Leibler divergence (relative entropy, I directed divergence), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance), J-distance (symmetrised Kullback-Leibler divergence, J divergence), Cauchy-Schwartz divergence, Euclidean distance based divergence, energy distance (specially the Cramér-Von Mises distance), Jensen-Shannon divergence, Jensen-Rényi divergence, K divergence, L divergence, certain f-divergences (Csiszár-Morimoto divergence, Ali-Silvey distance), nonsymmetric Bregman distance (Bregman divergence), Jensen-Tsallis divergence, symmetric Bregman distance,
 `association measures (A)`, including `measures of concordance`: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient, grade correlation coefficient), correntropy, centered correntropy, correntropy coefficient, correntropy induced metric, centered correntropy induced metric, multivariate extension of Blomqvist's beta (medial correlation coefficient), multivariate conditional version of Spearman's rho, lower/upper tail dependence via conditional Spearman's rho,
 `kernels on distributions (K)`: expected kernel, Bhattacharyya kernel, probability product kernel, Jensen-Shannon kernel, exponentiated Jensen-Shannon kernel, Jensen-Tsallis kernel.
+ `kernels on distributions (K)`: expected kernel, Bhattacharyya kernel, probability product kernel, Jensen-Shannon kernel, exponentiated Jensen-Shannon kernel, Jensen-Tsallis kernel, exponentiated Jensen-Rényi kernel.
 code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.43_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.43_code.tar.bz2),
+ code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.44_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.44_code.tar.bz2),
code/estimators/meta_estimators/KEJR1_HR_estimation.m
+%Estimates the exponentiated Jensen-Rényi kernel-1 of two distributions from which we have samples (Y1 and Y2) using the relation:
+%K_EJR1(f_1,f_2) = exp[u x H_R((y^1+y^2)/2)], where H_R is the Rényi entropy, (y^1+y^2)/2 is the mixture of y^1~f_1 and y^2~f_2 with 1/2-1/2 weights, u>0.
+% 1)We use the naming convention 'K<name>_estimation' to ease embedding new kernels on distributions.
+% Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+% Andre F. T. Martins, Noah A. Smith, Eric P. Xing, Pedro M. Q. Aguiar, and Mario A. T. Figueiredo. Nonextensive information theoretic kernels on measures. Journal of Machine Learning Research, 10:935–975, 2009.
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+%co.mult:OK. The information theoretical quantity of interest can be (and is!) estimated exactly [co.mult=1]; the computational complexity of the estimation is essentially the same as that of the 'up to multiplicative constant' case [co.mult=0].
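For intuition, the relation above can be sketched on discrete distributions, where the Rényi entropy has a closed form. This is a hypothetical NumPy illustration, not part of ITE (whose estimators work on continuous samples via entropy estimators); the names `renyi_entropy` and `k_ejr1` are invented here:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of a pmf: H_R(p) = log(sum(p^alpha)) / (1 - alpha), alpha != 1."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def k_ejr1(p, q, u=1.0, alpha=2.0):
    """Exponentiated Jensen-Renyi kernel-1: exp(u * H_R((p + q) / 2)), u > 0."""
    mixture = (np.asarray(p, dtype=float) + np.asarray(q, dtype=float)) / 2.0
    return np.exp(u * renyi_entropy(mixture, alpha))

p = np.array([0.5, 0.5])
q = np.array([0.9, 0.1])
print(k_ejr1(p, q, u=1.0, alpha=2.0))
```

The kernel grows with the entropy of the 1/2-1/2 mixture; for two identical uniform distributions on two points it equals exp(u*log 2).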
code/estimators/meta_estimators/KEJR1_HR_initialization.m
+%Initialization of the exponentiated Jensen-Rényi kernel-1 estimator defined according to the relation:
+%K_EJR1(f_1,f_2) = exp[u x H_R((y^1+y^2)/2)], where H_R is the Rényi entropy, (y^1+y^2)/2 is the mixture of y^1~f_1 and y^2~f_2 with 1/2-1/2 weights, u>0.
+% 2)We use the naming convention 'K<name>_initialization' to ease embedding new estimators for kernels on distributions.
+% post_init: {field_name1,field_value1,field_name2,field_value2,...}; cell array containing the names and the values of the cost object fields that are to be used
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+ co.member_co = H_initialization(co.member_name,mult,{'alpha',co.alpha}); %{'alpha',co.alpha}: the 'alpha' field of the member is also set here
code/estimators/meta_estimators/KEJR2_DJR_estimation.m
+%Estimates the exponentiated Jensen-Rényi kernel-2 of two distributions from which we have samples (Y1 and Y2) using the relation:
+% 1)We use the naming convention 'K<name>_estimation' to ease embedding new kernels on distributions.
+% Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+% Andre F. T. Martins, Noah A. Smith, Eric P. Xing, Pedro M. Q. Aguiar, and Mario A. T. Figueiredo. Nonextensive information theoretic kernels on measures. Journal of Machine Learning Research, 10:935–975, 2009.
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+%co.mult:OK. The information theoretical quantity of interest can be (and is!) estimated exactly [co.mult=1]; the computational complexity of the estimation is essentially the same as that of the 'up to multiplicative constant' case [co.mult=0].
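The defining relation of kernel-2 is not shown in this excerpt; judging from the file name (DJR) and Martins et al., it is presumably K_EJR2(f_1,f_2) = exp[-u x D_JR(f_1,f_2)], where D_JR is the Jensen-Rényi divergence with 1/2-1/2 weights. A hedged discrete-distribution sketch under that assumption (illustrative NumPy, not ITE code; all function names are invented here):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of a pmf, alpha != 1."""
    return np.log(np.sum(np.asarray(p, dtype=float) ** alpha)) / (1.0 - alpha)

def jensen_renyi_divergence(p, q, alpha=2.0):
    """D_JR with uniform 1/2-1/2 weights: H_R of the mixture minus the mean of the H_R's."""
    mix = (np.asarray(p, dtype=float) + np.asarray(q, dtype=float)) / 2.0
    return renyi_entropy(mix, alpha) - (renyi_entropy(p, alpha) + renyi_entropy(q, alpha)) / 2.0

def k_ejr2(p, q, u=1.0, alpha=2.0):
    """Assumed exponentiated Jensen-Renyi kernel-2: exp(-u * D_JR(p, q)), u > 0."""
    return np.exp(-u * jensen_renyi_divergence(p, q, alpha))

print(k_ejr2([0.5, 0.5], [0.9, 0.1]))
```

Under this form the kernel equals 1 for identical distributions (D_JR vanishes) and decays as the two distributions separate.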
code/estimators/meta_estimators/KEJR2_DJR_initialization.m
+%Initialization of the exponentiated Jensen-Rényi kernel-2 estimator defined according to the relation:
+% 2)We use the naming convention 'K<name>_initialization' to ease embedding new estimators for kernels on distributions.
+% post_init: {field_name1,field_value1,field_name2,field_value2,...}; cell array containing the names and the values of the cost object fields that are to be used
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+ co.member_name = 'JensenRenyi_HRenyi'; %you can change it to any Jensen-Rényi divergence estimator
+ co.member_co = D_initialization(co.member_name,mult,{'alpha',co.alpha,'w',[1/2,1/2]}); %{'alpha',co.alpha,'w',[1/2,1/2]}: the 'alpha' and 'w' fields of the member are also set here. If the Jensen-Rényi divergence estimator is restricted to w=[1/2,1/2], then of course '{'w',[1/2,1/2]}' can be discarded.
code/estimators/quick_tests/quick_test_KEJR1.m
+%Quick test for exponentiated Jensen-Rényi kernel-1 estimators: analytical expression vs estimated value as a function of the sample number. In the test, normal variables are considered.
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+ alpha_K = 2; %parameter of the Jensen-Rényi kernel; fixed; for alpha_K = 2 we have an explicit formula for the Rényi entropy, and hence for the Jensen-Rényi kernel.
+ co = K_initialization(cost_name,1,{'alpha',alpha_K,'u',u_K}); %{'alpha',alpha_K,'u',u_K}: set the 'alpha' and 'u' fields
+ %analytical value of the Rényi entropy (ref.: Fei Wang, Tanveer Syeda-Mahmood, Baba C. Vemuri, David Beymer, and Anand Rangarajan. Closed-Form Jensen-Rényi Divergence for Mixture of Gaussians and Applications to Group-Wise Shape Registration. Medical Image Computing and Computer-Assisted Intervention, 12:648–655, 2009.)
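The analytical value exists because for alpha = 2 the Rényi entropy of a Gaussian is closed form; in 1D, H_2(N(m, sigma^2)) = -log(integral of f^2) = log(2*sigma*sqrt(pi)). A quick Monte Carlo cross-check of that identity (illustrative Python, not the quick test itself), exploiting that the integral of f^2 equals E_f[f(X)] and so can be estimated by averaging the pdf over samples:

```python
import numpy as np

rng = np.random.default_rng(0)
m, sigma = 1.0, 2.0
x = rng.normal(m, sigma, size=200_000)  # samples from N(m, sigma^2)

# Gaussian pdf evaluated at the samples
pdf = np.exp(-(x - m) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
h2_mc = -np.log(pdf.mean())                    # plug-in Monte Carlo Renyi-2 entropy
h2_exact = np.log(2 * sigma * np.sqrt(np.pi))  # closed form for alpha = 2
print(h2_mc, h2_exact)
```

The two values agree to a few decimal places at this sample size, which is exactly the kind of convergence the quick test plots against the sample number.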
code/estimators/quick_tests/quick_test_KEJR2.m
+%Quick test for exponentiated Jensen-Rényi kernel-2 estimators: analytical expression vs estimated value as a function of the sample number. In the test, normal variables are considered.
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+ co = K_initialization(cost_name,1,{'alpha',2,'u',u}); %{'alpha',2,'u',u}: set the 'alpha' and 'u' fields. Note: for alpha = 2 there exists an explicit formula for the Jensen-Rényi divergence, and hence for the Jensen-Rényi kernel too.
+ %analytical value of the Jensen-Rényi divergence (ref.: Fei Wang, Tanveer Syeda-Mahmood, Baba C. Vemuri, David Beymer, and Anand Rangarajan. Closed-Form Jensen-Rényi Divergence for Mixture of Gaussians and Applications to Group-Wise Shape Registration. Medical Image Computing and Computer-Assisted Intervention, 12:648–655, 2009.)