94449cf
Files changed (8)

+11 -5 CHANGELOG.txt

+2 -2 README.md

+37 -0 code/estimators/meta_estimators/KEJS_DJS_estimation.m

+45 -0 code/estimators/meta_estimators/KEJS_DJS_initialization.m

+1 -1 code/estimators/meta_estimators/KJS_DJS_initialization.m

+47 -0 code/estimators/meta_estimators/KJT_HJT_estimation.m

+46 -0 code/estimators/meta_estimators/KJT_HJT_initialization.m

+2 -0 code/estimators/quick_tests/quick_test_Kpos_semidef.m
CHANGELOG.txt
+Exponentiated Jensen-Shannon kernel estimation: added; see 'KEJS_DJS_initialization.m', 'KEJS_DJS_estimation.m'.
ITE_install.m: made to be more user-friendly; detection of the already (i) deleted 'ann_wrapperO'/'ann_wrapperM' directory, (ii) downloaded ARfit package: added.
Probability product kernel estimation based on k-nearest neighbors: added; see 'KPP_kNN_k_initialization.m' and 'KPP_kNN_k_estimation.m'.
Jensen-Shannon kernel estimation: added; see 'KJS_DJS_initialization.m' and 'KJS_DJS_estimation.m'.
Bhattacharyya kernel estimation based on k-nearest neighbors: added; see 'KBhattacharyya_kNN_k_initialization.m' and 'KBhattacharyya_kNN_k_estimation.m'.
Symmetric Bregman distance estimation based on nonsymmetric Bregman distance: added; see 'DsymBregman_DBregman_initialization.m', 'DsymBregman_DBregman_estimation.m'.
Symmetric Bregman distance estimation based on k-nearest neighbors: added; see 'DsymBregman_kNN_k_initialization.m', 'DsymBregman_kNN_k_estimation.m'.
Jensen-Tsallis divergence estimation: added; see 'DJensenTsallis_HTsallis_initialization.m' and 'DJensenTsallis_HTsallis_estimation.m'.
Bregman distance estimation: added; see 'DBregman_kNN_k_initialization.m' and 'DBregman_kNN_k_estimation.m'.
README.md
- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.42_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.42_code.tar.bz2),
+ code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.43_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.43_code.tar.bz2),
code/estimators/meta_estimators/KEJS_DJS_estimation.m
+%Estimates the exponentiated Jensen-Shannon kernel of two distributions from which we have samples (Y1 and Y2) using the relation: K_EJS(f_1,f_2) = exp[-u x D_JS(f_1,f_2)], where D_JS is the Jensen-Shannon divergence, u>0.
+% 1)We use the naming convention 'K<name>_estimation' to ease embedding new kernels on distributions.
+% Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+% Andre F. T. Martins, Noah A. Smith, Eric P. Xing, Pedro M. Q. Aguiar, and Mario A. T. Figueiredo. Nonextensive information theoretic kernels on measures. Journal of Machine Learning Research, 10:935-975, 2009.
+% Andre F. T. Martins, Pedro M. Q. Aguiar, and Mario A. T. Figueiredo. Tsallis kernels on measures. In Information Theory Workshop (ITW), pages 298-302, 2008.
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+%co.mult:OK. The information theoretical quantity of interest can be (and is!) estimated exactly [co.mult=1]; the computational complexity of the estimation is essentially the same as that of the 'up to multiplicative constant' case [co.mult=0].
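The relation implemented by this meta-estimator, K(f_1,f_2) = exp[-u x D_JS(f_1,f_2)] with u>0, can be sketched numerically. The following is an illustrative Python sketch on known discrete distributions, not the ITE API (ITE estimates D_JS from samples via a plug-in Shannon entropy estimator); the names `shannon`, `d_js`, and `k_ejs` are hypothetical:

```python
import numpy as np

def shannon(p):
    """Shannon entropy in nats; 0*log(0) is taken as 0."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def d_js(p, q, w=(0.5, 0.5)):
    """Jensen-Shannon divergence: H(w1*p + w2*q) - (w1*H(p) + w2*H(q))."""
    return shannon(w[0]*p + w[1]*q) - (w[0]*shannon(p) + w[1]*shannon(q))

def k_ejs(p, q, u=1.0):
    """Exponentiated Jensen-Shannon kernel: exp(-u * D_JS), u > 0."""
    return np.exp(-u * d_js(p, q))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.1, 0.1, 0.8])
# K(p, p) = exp(0) = 1; for p != q, 0 < K(p, q) < 1 since D_JS > 0
```

With the minus sign in the exponent, identical distributions attain the maximal kernel value 1, and the kernel decays as the divergence grows.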
code/estimators/meta_estimators/KEJS_DJS_initialization.m
+%Initialization of the exponentiated Jensen-Shannon kernel estimator defined according to the relation: K_EJS(f_1,f_2) = exp[-u x D_JS(f_1,f_2)], where D_JS is the Jensen-Shannon divergence, u>0.
+% 2)We use the naming convention 'K<name>_initialization' to ease embedding new estimators for kernels on distributions.
+% post_init: {field_name1,field_value1,field_name2,field_value2,...}; cell array containing the names and the values of the cost object fields that are to be used
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+%mandatory fields (following the template structure of the estimators to make uniform usage of the estimators possible):
+ co.member_name = 'JensenShannon_HShannon'; %you can change it to any Jensen-Shannon divergence estimator
code/estimators/meta_estimators/KJS_DJS_initialization.m
- co.member_co = D_initialization(co.member_name,mult,{'w',[1/2,1/2]}); %{'w',[1/2 1/2]}: the 'w' weight field of the member is also set here; note: if the Jensen-Shannon divergence estimator is restricted to w=[1/2, 1/2], then of course you can leave '{...}'.
+ co.member_co = D_initialization(co.member_name,mult,{'w',[1/2,1/2]}); %{'w',[1/2 1/2]}: the 'w' weight field of the member is also set here; note: if the Jensen-Shannon divergence estimator is restricted to w=[1/2, 1/2], then of course '{...}' can be discarded.
code/estimators/meta_estimators/KJT_HJT_estimation.m
+%Estimates the Jensen-Tsallis kernel of two distributions from which we have samples (Y1 and Y2) using the relation: K_JT(f_1,f_2) = log_{alpha}(2) - T_alpha(f_1,f_2), where (i) log_{alpha} is the alpha-logarithm, (ii) T_alpha is the Jensen-Tsallis alpha-difference (that can be expressed in terms of the Tsallis entropy).
+% 1)We use the naming convention 'K<name>_estimation' to ease embedding new kernels on distributions.
+% Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+% Andre F. T. Martins, Noah A. Smith, Eric P. Xing, Pedro M. Q. Aguiar, and Mario A. T. Figueiredo. Nonextensive information theoretic kernels on measures. Journal of Machine Learning Research, 10:935-975, 2009.
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+%co.mult:OK. The information theoretical quantity of interest can be (and is!) estimated exactly [co.mult=1]; the computational complexity of the estimation is essentially the same as that of the 'up to multiplicative constant' case [co.mult=0].
+ T = H_estimation(mixtureY,co.member_co) - (w(1)^a * H_estimation(Y1,co.member_co) + w(2)^a * H_estimation(Y2,co.member_co));
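The computation above, K_JT = log_{alpha}(2) - T_alpha with T_alpha = H_alpha(mixture) - (w1^alpha H_alpha(f_1) + w2^alpha H_alpha(f_2)), can be mirrored on known discrete distributions. The following is an illustrative Python sketch, not the ITE API (ITE works on samples and a plug-in Tsallis entropy estimator); `tsallis`, `log_alpha`, and `k_jt` are hypothetical names, and w=[1/2,1/2] as in the meta-estimator:

```python
import numpy as np

def tsallis(p, alpha):
    """Tsallis entropy: (sum(p^alpha) - 1) / (1 - alpha), alpha != 1."""
    return (np.sum(p**alpha) - 1.0) / (1.0 - alpha)

def log_alpha(x, alpha):
    """alpha-logarithm: (x^(1-alpha) - 1) / (1 - alpha), alpha != 1."""
    return (x**(1.0 - alpha) - 1.0) / (1.0 - alpha)

def k_jt(p, q, alpha, w=(0.5, 0.5)):
    """Jensen-Tsallis kernel: log_alpha(2) - T_alpha(p, q), where
    T_alpha = H_alpha(w1*p + w2*q) - (w1^alpha*H_alpha(p) + w2^alpha*H_alpha(q))."""
    t = tsallis(w[0]*p + w[1]*q, alpha) - (
        w[0]**alpha * tsallis(p, alpha) + w[1]**alpha * tsallis(q, alpha))
    return log_alpha(2.0, alpha) - t

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.1, 0.1, 0.8])
```

Note that, unlike the exponentiated Jensen-Shannon case, T_alpha(p,p) is nonzero for alpha != 1: for alpha = 2 the identities above reduce K_JT(p,p) to sum(p^2)/2.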
code/estimators/meta_estimators/KJT_HJT_initialization.m
+%Initialization of the Jensen-Tsallis kernel estimator defined according to the relation: K_JT(f_1,f_2) = log_{alpha}(2) - T_alpha(f_1,f_2), where (i) log_{alpha} is the alpha-logarithm, (ii) T_alpha is the Jensen-Tsallis alpha-difference (that can be expressed in terms of the Tsallis entropy).
+% 2)We use the naming convention 'K<name>_initialization' to ease embedding new estimators for kernels on distributions.
+% post_init: {field_name1,field_value1,field_name2,field_value2,...}; cell array containing the names and the values of the cost object fields that are to be used
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+%mandatory fields (following the template structure of the estimators to make uniform usage of the estimators possible):
+ co.member_co = H_initialization(co.member_name,mult,{'alpha',co.alpha}); %{'alpha',co.alpha}: the 'alpha' field of the member is also set here
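The alpha parameter set here connects the two kernel families in this commit: as alpha approaches 1 (a value excluded by the formulas above), the alpha-logarithm tends to the natural logarithm and the Tsallis entropy tends to the Shannon entropy, so the Jensen-Tsallis construction degenerates to the Jensen-Shannon one. A small self-contained numeric check of these limits (illustrative Python, not ITE code):

```python
import numpy as np

def shannon(p):
    """Shannon entropy in nats; 0*log(0) is taken as 0."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def tsallis(p, alpha):
    """Tsallis entropy: (sum(p^alpha) - 1) / (1 - alpha), alpha != 1."""
    return (np.sum(p**alpha) - 1.0) / (1.0 - alpha)

def log_alpha(x, alpha):
    """alpha-logarithm: (x^(1-alpha) - 1) / (1 - alpha), alpha != 1."""
    return (x**(1.0 - alpha) - 1.0) / (1.0 - alpha)

p = np.array([0.5, 0.3, 0.2])
alpha = 1.000001  # close to the excluded alpha = 1 case
# tsallis(p, alpha) ~ shannon(p) and log_alpha(2, alpha) ~ ln(2)
```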