Files changed (7)

+2 −0   CHANGELOG.txt
+6 −6   README.md
+61 −0  code/H_I_D_A_C/base_estimators/DMMD_online_estimation.m
+30 −0  code/H_I_D_A_C/base_estimators/DMMD_online_initialization.m
+0 −61  code/H_I_D_A_C/base_estimators/DMMDonline_estimation.m
+0 −30  code/H_I_D_A_C/base_estimators/DMMDonline_initialization.m
+1 −1   code/H_I_D_A_C/meta_estimators/IMMD_DMMD_initialization.m
README.md
-- `entropy`: Shannon entropy, Rényi entropy, Tsallis entropy (Havrda and Charvát entropy), complex entropy,
-- `mutual information`: generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon mutual information, L2 mutual information, Rényi mutual information, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa, complex mutual information, Cauchy-Schwartz quadratic mutual information, Euclidean distance based quadratic mutual information,
-- `divergence`: Kullback-Leibler divergence (relative entropy), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance), J-distance (symmetrised Kullback-Leibler divergence), Cauchy-Schwartz divergence, Euclidean distance based divergence,
-- `association measures`: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient),
+- `entropy (H)`: Shannon entropy, Rényi entropy, Tsallis entropy (Havrda and Charvát entropy), complex entropy,
+- `mutual information (I)`: generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon mutual information, L2 mutual information, Rényi mutual information, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa, complex mutual information, Cauchy-Schwartz quadratic mutual information, Euclidean distance based quadratic mutual information,
+- `divergence (D)`: Kullback-Leibler divergence (relative entropy), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance), J-distance (symmetrised Kullback-Leibler divergence), Cauchy-Schwartz divergence, Euclidean distance based divergence,
+- `association measures (A)`: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient),
-- if you have an entropy, mutual information, divergence estimator/subtask solver with a GPLv3(>=)-compatible license that you would like to be embedded into ITE, feel free to contact me.
+- if you have an H/I/D/A/C estimator/subtask solver with a GPLv3(>=)-compatible license that you would like to be embedded into ITE, feel free to contact me.
code/H_I_D_A_C/base_estimators/DMMD_online_estimation.m
+%We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+% Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] must be equal; otherwise their minimum is taken.
+% Arthur Gretton, Karsten M. Borgwardt, Malte J. Rasch, Bernhard Scholkopf and Alexander Smola. A Kernel Two-Sample Test. Journal of Machine Learning Research 13 (2012) 723-773. See Lemma 14.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+ warning('There must be an equal number of samples in Y1 and Y2 for this estimator. The minimum of the sample numbers has been taken.');
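
For orientation, here is a minimal MATLAB sketch (not the ITE implementation itself) of the linear-time MMD statistic described in Lemma 14 of Gretton et al. (2012), which is the result the comments above reference. It assumes a Gaussian (RBF) kernel with a user-chosen bandwidth sigma; the function name MMD_online_sketch, the kernel choice, and the returned quantity (an estimate of squared MMD) are illustrative assumptions and may differ from what DMMD_online_estimation.m actually computes.

function D = MMD_online_sketch(Y1,Y2,sigma)
%MMD_ONLINE_SKETCH  Illustrative linear-time estimate of MMD^2 (hypothetical helper, not part of ITE).
    m = min(size(Y1,2),size(Y2,2));               %equal sample numbers are assumed; otherwise take the minimum
    m = 2*floor(m/2);                             %an even sample number is needed for the pairing (2i-1,2i)
    k = @(A,B) exp(-sum((A-B).^2,1)/(2*sigma^2)); %Gaussian kernel, evaluated column-wise
    o = 1:2:m;                                    %indices 2i-1
    e = 2:2:m;                                    %indices 2i
    %h-statistic of Lemma 14: h = k(x,x') + k(y,y') - k(x,y') - k(x',y), averaged over the m/2 pairs
    h = k(Y1(:,o),Y1(:,e)) + k(Y2(:,o),Y2(:,e)) - k(Y1(:,o),Y2(:,e)) - k(Y1(:,e),Y2(:,o));
    D = mean(h);                                  %unbiased estimate of MMD^2 in O(m) time
end

Because each sample pair contributes exactly one h-term, the statistic can be accumulated pair by pair with constant memory, which is what makes an online variant of the estimator possible.
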
code/H_I_D_A_C/base_estimators/DMMD_online_initialization.m
+% 2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
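
As a usage illustration of the renamed pair, the sketch below follows the 'D<name>_initialization' / 'D<name>_estimation' calling pattern suggested by the file names and comments; the exact argument lists (in particular the initialization argument, assumed here to be the multiplicative-constant flag) and the generated data are assumptions, not the toolbox's documented interface.

Y1 = randn(3,2000);                    %samples from the first distribution, one sample per column
Y2 = randn(3,2000) + 1;                %samples from the second (shifted) distribution, same sample number
co = DMMD_online_initialization(1);    %build the estimator's cost object (argument assumed to be the 'mult' flag)
D  = DMMD_online_estimation(Y1,Y2,co); %estimate the MMD divergence of the two sample sets
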
code/H_I_D_A_C/base_estimators/DMMDonline_estimation.m
-%We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
-% Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] must be equal; otherwise their minimum is taken.
-% Arthur Gretton, Karsten M. Borgwardt, Malte J. Rasch, Bernhard Scholkopf and Alexander Smola. A Kernel Two-Sample Test. Journal of Machine Learning Research 13 (2012) 723-773. See Lemma 14.
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
- warning('There must be equal number of samples in Y1 and Y2 for this estimator. Minimum of the sample numbers has been taken.');
code/H_I_D_A_C/base_estimators/DMMDonline_initialization.m
-% 2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
CHANGELOG.txt