Files changed (7)

+4 -0 CHANGELOG.txt
+3 -3 README.md
+38 -0 code/H_I_D_A_C_K/base_estimators/KPP_kNN_k_estimation.m
+63 -0 code/H_I_D_A_C_K/base_estimators/KPP_kNN_k_initialization.m
+36 -0 code/H_I_D_A_C_K/meta_estimators/KJS_DJS_estimation.m
+32 -0 code/H_I_D_A_C_K/meta_estimators/KJS_DJS_initialization.m
+1 -0 code/shared/embedded/KDP/mat_oct/mexme.m
CHANGELOG.txt
+Probability product kernel estimation based on k-nearest neighbors: added; see 'KPP_kNN_k_initialization.m' and 'KPP_kNN_k_estimation.m'.
+Jensen-Shannon kernel estimation: added; see 'KJS_DJS_initialization.m' and 'KJS_DJS_estimation.m'.
Bhattacharyya kernel estimation based on k-nearest neighbors: added; see 'KBhattacharyya_kNN_k_initialization.m' and 'KBhattacharyya_kNN_k_estimation.m'.
README.md
 `divergence (D)`: Kullback-Leibler divergence (relative entropy, I directed divergence), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance), J-distance (symmetrised Kullback-Leibler divergence, J divergence), Cauchy-Schwartz divergence, Euclidean distance based divergence, energy distance (specially the Cramer-Von Mises distance), Jensen-Shannon divergence, Jensen-Rényi divergence, K divergence, L divergence, certain f-divergences (Csiszár-Morimoto divergence, Ali-Silvey distance), nonsymmetric Bregman distance (Bregman divergence), Jensen-Tsallis divergence, symmetric Bregman distance,
 `association measures (A)`, including `measures of concordance`: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient, grade correlation coefficient), correntropy, centered correntropy, correntropy coefficient, correntropy induced metric, centered correntropy induced metric, multivariate extension of Blomqvist's beta (medial correlation coefficient), multivariate conditional version of Spearman's rho, lower/upper tail dependence via conditional Spearman's rho,
+ `kernels on distributions (K)`: expected kernel, Bhattacharyya kernel, probability product kernel, Jensen-Shannon kernel.
 code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.40_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.40_code.tar.bz2),
+ code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.41_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.41_code.tar.bz2),
code/H_I_D_A_C_K/base_estimators/KPP_kNN_k_estimation.m
+%Estimates the probability product kernel of two distributions from which we have samples, Y1 and Y2. The estimation is based on k-nearest neighbors (S={k}).
+% Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+% Barnabas Poczos and Liang Xiong and Dougal Sutherland and Jeff Schneider. Support Distribution Machines. Technical Report, 2012. "http://arxiv.org/abs/1202.0302" (k-nearest neighbor based estimation)
+% Tony Jebara, Risi Kondor, and Andrew Howard. Probability product kernels. Journal of Machine Learning Research, 5:819-844, 2004. (probability product kernels; the Bhattacharyya kernel is a special case)
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
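The k-nearest-neighbor route to the probability product kernel K(p1,p2) = ∫ p1^ρ(x) p2^ρ(x) dx can be illustrated with a naive plug-in sketch. Python is used here purely for illustration (the toolbox itself is MATLAB/Octave); the function names are hypothetical, and the bias corrections of the Poczos et al. estimator are deliberately omitted:

```python
import numpy as np
from math import gamma, pi

def knn_density(query, sample, k):
    """Naive kNN density estimate p(x) ~= k / (n * V_d * r_k(x)^d), where
    r_k(x) is the distance from x to its k-th nearest neighbour in `sample`
    and V_d is the volume of the d-dimensional unit ball."""
    n, d = sample.shape
    dist2 = ((query[:, None, :] - sample[None, :, :]) ** 2).sum(axis=-1)
    r = np.sqrt(np.sort(dist2, axis=1)[:, k - 1])   # k-th NN distance
    v_d = pi ** (d / 2) / gamma(d / 2 + 1)
    return k / (n * v_d * r ** d)

def pp_kernel_knn(y1, y2, rho=0.5, k=3):
    """Plug-in estimate of K(p1,p2) = E_{p1}[p1^(rho-1) * p2^rho],
    i.e. of the probability product kernel, from samples y1 ~ p1, y2 ~ p2
    stored row-wise (ITE stores samples column-wise)."""
    # k+1 for the self-density: the nearest point of y1 within y1 is itself
    # (distance 0), so the (k+1)-th neighbour is the k-th genuine one.
    p1_at_y1 = knn_density(y1, y1, k + 1)
    p2_at_y1 = knn_density(y1, y2, k)
    return float(np.mean(p1_at_y1 ** (rho - 1) * p2_at_y1 ** rho))
```

With rho = 0.5 this specializes to the Bhattacharyya kernel; for p1 = p2 the population value is 1, which a plug-in estimate of this kind only approaches up to kNN bias.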
code/H_I_D_A_C_K/base_estimators/KPP_kNN_k_initialization.m
+%Initialization of the probability product kernel estimator. The estimation is based on k-nearest neighbors (S={k}).
+% 2)We use the naming convention 'K<name>_initialization' to ease embedding new estimators for kernels on distributions.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+ %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
+ %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true nearest neighbor distances by at most a factor of (1+epsi).
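The exact-vs-approximate switch described by co.epsi has a direct analogue in SciPy's cKDTree, whose `eps` argument carries the same (1+eps) guarantee. The snippet below is only an illustration of that semantics (SciPy is not part of ITE):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
tree = cKDTree(rng.normal(size=(5000, 3)))   # build a k-d tree on the sample
query = rng.normal(size=(100, 3))

d_exact, _ = tree.query(query, k=4)            # eps=0 (default): exact kNN
d_approx, _ = tree.query(query, k=4, eps=0.1)  # approximate kNN

# Guarantee mirrored by co.epsi: the j-th returned distance is at most
# (1 + eps) times the distance to the true j-th nearest neighbour.
assert np.all(d_approx <= (1 + 0.1) * d_exact + 1e-12)
```

The payoff of eps > 0 is faster tree traversal (branches that cannot beat the current candidates by the (1+eps) margin are pruned), at the cost of slightly inflated distances.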
code/H_I_D_A_C_K/meta_estimators/KJS_DJS_estimation.m
+%Estimates the Jensen-Shannon kernel of two distributions from which we have samples (Y1 and Y2) using the relation: K_JS(f_1,f_2) = log(2) - D_JS(f_1,f_2), where D_JS is the Jensen-Shannon divergence.
+% 1)We use the naming convention 'K<name>_estimation' to ease embedding new kernels on distributions.
+% Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+% Andre F. T. Martins, Noah A. Smith, Eric P. Xing, Pedro M. Q. Aguiar, and Mario A. T. Figueiredo. Nonextensive information theoretical kernels on measures. Journal of Machine Learning Research, 10:935-975, 2009.
+% Andre F. T. Martins, Pedro M. Q. Aguiar, and Mario A. T. Figueiredo. Tsallis kernels on measures. In Information Theory Workshop (ITW), pages 298-302, 2008.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
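Since D_JS is bounded by log(2), the relation K_JS = log(2) - D_JS yields a kernel with values in [0, log 2]: the maximum is attained exactly when the two distributions coincide, and 0 when their supports are disjoint. A minimal discrete sketch (Python, natural logarithm; the function names are hypothetical, not ITE's):

```python
import numpy as np

def d_js(p, q):
    """Jensen-Shannon divergence D_JS(p,q) = H((p+q)/2) - (H(p)+H(q))/2
    for discrete distributions p, q (natural logarithm)."""
    def h(x):
        x = x[x > 0]                      # convention: 0 * log(0) = 0
        return -np.sum(x * np.log(x))
    m = (p + q) / 2
    return h(m) - (h(p) + h(q)) / 2

def k_js(p, q):
    """Jensen-Shannon kernel via the relation K_JS = log(2) - D_JS."""
    return np.log(2) - d_js(p, q)
```

For example, k_js(p, p) equals log(2) for any distribution p, while for two distributions with disjoint support D_JS = log(2) and the kernel vanishes.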
code/H_I_D_A_C_K/meta_estimators/KJS_DJS_initialization.m
+%Initialization of the Jensen-Shannon kernel estimator defined according to the relation: K_JS(f_1,f_2) = log(2) - D_JS(f_1,f_2), where D_JS is the Jensen-Shannon divergence.
+% 2)We use the naming convention 'K<name>_initialization' to ease embedding new estimators for kernels on distributions.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+ co.member_name = 'JensenShannon_HShannon'; %you can change it to any Jensen-Shannon divergence estimator
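The co.member_name field is what makes this a meta-estimator: the K_JS code only encodes the relation K_JS = log(2) - D_JS and delegates D_JS to whichever base estimator was named at initialization. A sketch of that plug-in pattern (Python; the registry and the toy histogram estimator below are hypothetical illustrations, not ITE's implementation):

```python
import numpy as np

def djs_histogram(y1, y2, bins=20):
    """Toy Jensen-Shannon divergence estimator from 1-D samples via a
    shared histogram grid (illustration only, not ITE's estimator)."""
    lo, hi = min(y1.min(), y2.min()), max(y1.max(), y2.max())
    p = np.histogram(y1, bins=bins, range=(lo, hi))[0].astype(float)
    q = np.histogram(y2, bins=bins, range=(lo, hi))[0].astype(float)
    p, q = p / p.sum(), q / q.sum()
    def h(x):
        x = x[x > 0]
        return -np.sum(x * np.log(x))
    return h((p + q) / 2) - (h(p) + h(q)) / 2

DJS_ESTIMATORS = {'histogram': djs_histogram}   # hypothetical registry

def kjs_initialization(member_name='histogram'):
    """Mirror of co.member_name: record which D_JS estimator to delegate to."""
    return {'member_name': member_name}

def kjs_estimation(y1, y2, co):
    """Meta-estimator: K_JS = log(2) - D_JS, with D_JS supplied by the member."""
    return np.log(2) - DJS_ESTIMATORS[co['member_name']](y1, y2)
```

Swapping the divergence estimator then requires no change to the kernel code, only a different member_name at initialization, which is the point of the 'K&lt;name&gt;_initialization' / 'K&lt;name&gt;_estimation' convention.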