Files changed (8)

+5 -1 CHANGELOG.txt

+4 -4 README.md

+59 -0 code/H_I_D_A_C/base_estimators/DMMD_Ustat_estimation.m

+35 -0 code/H_I_D_A_C/base_estimators/DMMD_Ustat_initialization.m

+65 -0 code/H_I_D_A_C/base_estimators/DMMD_Vstat_estimation.m

+35 -0 code/H_I_D_A_C/base_estimators/DMMD_Vstat_initialization.m

+16 -3 code/H_I_D_A_C/base_estimators/DMMD_online_estimation.m

+7 -2 code/H_I_D_A_C/base_estimators/DMMD_online_initialization.m
CHANGELOG.txt
+MMD estimation based on U- and V-statistics: added; see 'DMMD_Ustat_initialization.m', 'DMMD_Ustat_estimation.m', 'DMMD_Vstat_initialization.m', 'DMMD_Vstat_estimation.m'.
+Online MMD estimation: now RBF and linear kernels are both available; see 'DMMD_online_initialization.m', 'DMMD_online_estimation.m'.
+'MMDonline' renamed to 'MMD_online'; see 'DMMD_online_initialization.m', 'DMMD_online_estimation.m'; 'IMMD_DMMD_initialization.m': modified accordingly.
Three multivariate extensions of Spearman's rho: added; see 'ASpearman1_initialization.m', 'ASpearman1_estimation.m', 'ASpearman2_initialization.m', 'ASpearman2_estimation.m', 'ASpearman3_initialization.m', 'ASpearman3_estimation.m'.
README.md
 `entropy (H)`: Shannon entropy, Rényi entropy, Tsallis entropy (Havrda and Charvát entropy), complex entropy,
 `mutual information (I)`: generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon mutual information, L2 mutual information, Rényi mutual information, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa, complex mutual information, Cauchy-Schwartz quadratic mutual information, Euclidean distance based quadratic mutual information,
 `divergence (D)`: Kullback-Leibler divergence (relative entropy), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance), J-distance (symmetrised Kullback-Leibler divergence), Cauchy-Schwartz divergence, Euclidean distance based divergence,
 `association measures (A)`: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient),
+ `divergence (D)`: Kullback-Leibler divergence (relative entropy), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance, an integral probability metric), J-distance (symmetrised Kullback-Leibler divergence), Cauchy-Schwartz divergence, Euclidean distance based divergence,
+ `association measures (A)`, including `measures of concordance`: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient, grade correlation coefficient),
 code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.23_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.23_code.tar.bz2),
+ code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.24_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.24_code.tar.bz2),
code/H_I_D_A_C/base_estimators/DMMD_Ustat_estimation.m
+%Estimates divergence (D) of Y1 and Y2 using the MMD (maximum mean discrepancy) method, applying U-statistics.
+%We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+% Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+% Arthur Gretton, Karsten M. Borgwardt, Malte J. Rasch, Bernhard Scholkopf and Alexander Smola. A Kernel Two-Sample Test. Journal of Machine Learning Research 13 (2012) 723-773.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
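The unbiased U-statistic estimator that this file implements (Eq. 3 of Gretton et al., 2012) can be sketched outside MATLAB as well. The snippet below is an illustrative NumPy translation of the idea, not the toolbox's code; the Gaussian kernel and its bandwidth `sigma` are assumptions mirroring the commit's RBF option, and samples are stored column-wise as in ITE.

```python
import numpy as np

def rbf_gram(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel Gram matrix; X: (d, m), Y: (d, n),
    # with each column holding one sample (ITE's convention).
    sq = (X**2).sum(0)[:, None] + (Y**2).sum(0)[None, :] - 2.0 * X.T @ Y
    return np.exp(-sq / (2.0 * sigma**2))

def mmd2_ustat(Y1, Y2, sigma=1.0):
    # Unbiased (U-statistic) MMD^2: diagonal (i = j) terms are excluded,
    # so the two sample sizes m and n may differ.
    m, n = Y1.shape[1], Y2.shape[1]
    Kxx = rbf_gram(Y1, Y1, sigma)
    Kyy = rbf_gram(Y2, Y2, sigma)
    Kxy = rbf_gram(Y1, Y2, sigma)
    term_x = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_y = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_x + term_y - 2.0 * Kxy.mean()
```

Excluding the diagonal removes the bias of the plug-in estimate, at the cost of allowing slightly negative values for finite samples.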
code/H_I_D_A_C/base_estimators/DMMD_Ustat_initialization.m
+% 2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
code/H_I_D_A_C/base_estimators/DMMD_Vstat_estimation.m
+%Estimates divergence (D) of Y1 and Y2 using the MMD (maximum mean discrepancy) method, applying V-statistics.
+%We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+% Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+% Arthur Gretton, Karsten M. Borgwardt, Malte J. Rasch, Bernhard Scholkopf and Alexander Smola. A Kernel Two-Sample Test. Journal of Machine Learning Research 13 (2012) 723-773.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
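The V-statistic variant keeps the diagonal terms, giving the biased plug-in estimate of MMD². A hedged NumPy sketch of the idea (again not the toolbox's MATLAB code; the RBF kernel and `sigma` are assumed, columns are samples):

```python
import numpy as np

def rbf_gram(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel Gram matrix; X: (d, m), Y: (d, n).
    sq = (X**2).sum(0)[:, None] + (Y**2).sum(0)[None, :] - 2.0 * X.T @ Y
    return np.exp(-sq / (2.0 * sigma**2))

def mmd2_vstat(Y1, Y2, sigma=1.0):
    # Biased (V-statistic) MMD^2: all i, j pairs, diagonal included,
    # i.e. simple means of the three Gram matrices.
    Kxx = rbf_gram(Y1, Y1, sigma)
    Kyy = rbf_gram(Y2, Y2, sigma)
    Kxy = rbf_gram(Y1, Y2, sigma)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()
```

Unlike the U-statistic, this estimate is always non-negative and is exactly zero when the two samples coincide.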
code/H_I_D_A_C/base_estimators/DMMD_Vstat_initialization.m
+% 2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
code/H_I_D_A_C/base_estimators/DMMD_online_estimation.m
+ D = (K_RBF(Y1i,Y1j,co) + K_RBF(Y2i,Y2j,co) - K_RBF(Y1i,Y2j,co) - K_RBF(Y1j,Y2i,co)) / (num_of_samples/2);
+ D = (K_linear(Y1i,Y1j) + K_linear(Y2i,Y2j) - K_linear(Y1i,Y2j) - K_linear(Y1j,Y2i)) / (num_of_samples/2);
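These two lines implement the linear-time (online) MMD² estimator of Gretton et al. (2012): samples are consumed in non-overlapping pairs (Y1i, Y1j) and (Y2i, Y2j), and the h-statistic k(x,x') + k(y,y') - k(x,y') - k(x',y) is averaged over the num_of_samples/2 pairs. A self-contained NumPy sketch of the same computation, covering the commit's two kernel choices (the bandwidth `sigma` is an assumed parameter):

```python
import numpy as np

def mmd2_online(Y1, Y2, kernel="rbf", sigma=1.0):
    # Linear-time (online) MMD^2: each sample enters exactly one
    # non-overlapping pair, so cost is O(n) instead of O(n^2).
    d, n = Y1.shape
    assert Y2.shape == (d, n) and n % 2 == 0

    def k(u, v):
        # Column-wise kernel values between paired samples.
        if kernel == "rbf":
            return np.exp(-((u - v) ** 2).sum(0) / (2.0 * sigma**2))
        return (u * v).sum(0)  # linear kernel

    i, j = Y1[:, 0::2], Y1[:, 1::2]  # (Y1i, Y1j) pairs
    p, q = Y2[:, 0::2], Y2[:, 1::2]  # (Y2i, Y2j) pairs
    h = k(i, j) + k(p, q) - k(i, q) - k(j, p)
    return h.mean()  # average over num_of_samples/2 pairs
```

Averaging h over the pairs is the same as the division by (num_of_samples/2) in the MATLAB lines above; the estimate has higher variance than the U/V statistics but needs only one pass over the data.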