Commits

Zoltan Szabo committed c2c43aa

'MMDonline' renamed to 'MMD_online'; see 'DMMD_online_initialization.m', 'DMMD_online_estimation.m'; 'IMMD_DMMD_initialization.m': modified accordingly.

Files changed (7)

+'MMDonline' renamed to 'MMD_online'; see 'DMMD_online_initialization.m', 'DMMD_online_estimation.m'; 'IMMD_DMMD_initialization.m': modified accordingly.
+
 v0.23 (Dec 07, 2012):
 -Three multivariate extensions of Spearman's rho: added; see 'ASpearman1_initialization.m', 'ASpearman1_estimation.m', 'ASpearman2_initialization.m', 'ASpearman2_estimation.m', 'ASpearman3_initialization.m', 'ASpearman3_estimation.m'.
 -Association (A) cost object type: added; see 'A_initialization.m', 'A_estimation.m'. 
 
 ITE can estimate 
 
-- `entropy`: Shannon entropy, Rényi entropy, Tsallis entropy (Havrda and Charvát entropy), complex entropy,
-- `mutual information`: generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon mutual information, L2 mutual information, Rényi mutual information, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa, complex mutual information, Cauchy-Schwartz quadratic mutual information, Euclidean distance based quadratic mutual information,
-- `divergence`: Kullback-Leibler divergence (relative entropy), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance), J-distance (symmetrised Kullback-Leibler divergence), Cauchy-Schwartz divergence, Euclidean distance based divergence,
-- `association measures`: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient),
-- `cross quantities`: cross-entropy.
+- `entropy (H)`: Shannon entropy, Rényi entropy, Tsallis entropy (Havrda and Charvát entropy), complex entropy,
+- `mutual information (I)`: generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon mutual information, L2 mutual information, Rényi mutual information, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa, complex mutual information, Cauchy-Schwartz quadratic mutual information, Euclidean distance based quadratic mutual information,
+- `divergence (D)`: Kullback-Leibler divergence (relative entropy), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance), J-distance (symmetrised Kullback-Leibler divergence), Cauchy-Schwartz divergence, Euclidean distance based divergence,
+- `association measures (A)`: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient),
+- `cross quantities (C)`: cross-entropy.
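All estimators follow the same two-step pattern: build a cost object with '<letter><name>_initialization', then call the matching '<letter><name>_estimation' on samples. A minimal sketch with the divergence estimator from this commit (Y1 and Y2 stand for placeholder d-by-n sample matrices, not part of the toolbox):

co = DMMD_online_initialization(1);   %cost object; mult=1
D = DMMD_online_estimation(Y1,Y2,co); %MMD-based divergence estimate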
 
 ITE offers solution methods for 
 
 
 - the evolution of the ITE code is briefly summarized in CHANGELOG.txt.
 - become a [Follower](https://bitbucket.org/szzoli/ite/follow) to be always up-to-date with ITE.
-- if you have an entropy, mutual information, divergence estimator/subtask solver with a GPLv3(>=)-compatible license that you would like to be embedded into ITE, feel free to contact me.
+- if you have an H/I/D/A/C estimator/subtask solver with a GPLv3(>=)-compatible license that you would like to be embedded into ITE, feel free to contact me.
 
 **Download** the latest release: 
 

code/H_I_D_A_C/base_estimators/DMMD_online_estimation.m

+function [D] = DMMD_online_estimation(Y1,Y2,co)
+%Estimates divergence (D) of Y1 and Y2 using the MMD (maximum mean discrepancy) method, online. 
+%
+%We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] must be equal; otherwise their minimum is taken.
+%  co: divergence estimator object.
+%
+%REFERENCE: 
+%   Arthur Gretton, Karsten M. Borgwardt, Malte J. Rasch, Bernhard Schölkopf and Alexander Smola. A Kernel Two-Sample Test. Journal of Machine Learning Research 13 (2012) 723-773. See Lemma 14.
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK.
+
+%verification:
+    [dY1,num_of_samplesY1] = size(Y1);
+    [dY2,num_of_samplesY2] = size(Y2);
+    %size(Y1) must be equal to size(Y2):
+        if num_of_samplesY1~=num_of_samplesY2
+            warning('There must be an equal number of samples in Y1 and Y2 for this estimator; the minimum of the two sample numbers has been taken.');
+        end
+        if dY1~=dY2
+            error('The dimension of the samples in Y1 and Y2 must be equal.');
+        end
+    num_of_samples = min(num_of_samplesY1,num_of_samplesY2);        
+    %Number of samples must be even:
+        if ~all_even(num_of_samples)
+            warning('The number of samples must be even; the last sample has been discarded.');
+            num_of_samples = num_of_samples - 1;
+        end
+        
+%initialization: 
+    odd_indices = [1:2:num_of_samples];
+    even_indices = [2:2:num_of_samples];
+    
+%Y1i,Y1j,Y2i,Y2j:
+    Y1i = Y1(:,odd_indices);
+    Y1j = Y1(:,even_indices);
+    Y2i = Y2(:,odd_indices);
+    Y2j = Y2(:,even_indices);
+
+D = (K(Y1i,Y1j,co) + K(Y2i,Y2j,co) - K(Y1i,Y2j,co) - K(Y1j,Y2i,co)) / (num_of_samples/2);
+
+%-----------------------------
+function [s] = K(U,V,co)
+%Computes \sum_i kernel(U(:,i),V(:,i)); an RBF (Gaussian) kernel is used with std=co.sigma.
+
+s = sum( exp(-sum((U-V).^2,1)/(2*co.sigma^2)) );
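The statistic above is the linear-time estimate of Lemma 14 in the cited paper: with h((x,y),(x',y')) = k(x,x') + k(y,y') - k(x,y') - k(x',y), it averages h over the n/2 non-overlapping consecutive sample pairs, so each sample is used only once and the estimator suits online processing; the result is an unbiased estimate of the squared MMD and can be slightly negative when the two distributions coincide. An illustrative run (the toy data and the sigma override are assumptions of this sketch, not part of the commit):

Y1 = randn(3,2000);                  %toy sample from N(0,I) in R^3
Y2 = randn(3,2000) + 2;              %toy sample from a mean-shifted Gaussian
co = DMMD_online_initialization(1);
co.sigma = 1;                        %widen the RBF bandwidth for this scale (default: 0.01)
D = DMMD_online_estimation(Y1,Y2,co) %expected to be clearly positive here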

code/H_I_D_A_C/base_estimators/DMMD_online_initialization.m

+function [co] = DMMD_online_initialization(mult)
+%Initialization of the online MMD (maximum mean discrepancy) divergence estimator.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%   mult: whether the multiplicative constant is relevant (needed) in the estimation; '=1' means yes, '=0' means no.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields:
+    co.name = 'MMD_online';
+    co.mult = mult;
+    
+%other fields:    
+    co.sigma = 0.01;%std in the RBF (Gaussian) kernel
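Note the coupling this commit preserves: co.name = 'MMD_online' together with the 'D<name>_initialization' / 'D<name>_estimation' file-name convention lets higher-level code reconstruct the estimator's functions from the cost object alone. A sketch of such a lookup (an assumed illustration; ITE's actual glue code may differ):

estimation_fun = str2func(strcat('D',co.name,'_estimation')); %-> @DMMD_online_estimation
D = estimation_fun(Y1,Y2,co);                                 %Y1,Y2: placeholder samples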

code/H_I_D_A_C/base_estimators/DMMDonline_estimation.m

-function [D] = DMMDonline_estimation(Y1,Y2,co)
-%Estimates divergence (D) of Y1 and Y2 using the MMD (maximum mean discrepancy) method, online. 
-%
-%We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
-%
-%INPUT:
-%  Y1: Y1(:,t) is the t^th sample from the first distribution.
-%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] must be equal; otherwise their minimum is taken.
-%  co: divergence estimator object.
-%
-%REFERENCE: 
-%   Arthur Gretton, Karsten M. Borgwardt, Malte J. Rasch, Bernhard Schölkopf and Alexander Smola. A Kernel Two-Sample Test. Journal of Machine Learning Research 13 (2012) 723-773. See Lemma 14.
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%co.mult:OK.
-
-%verification:
-    [dY1,num_of_samplesY1] = size(Y1);
-    [dY2,num_of_samplesY2] = size(Y2);
-    %size(Y1) must be equal to size(Y2):
-        if num_of_samplesY1~=num_of_samplesY2
-            warning('There must be an equal number of samples in Y1 and Y2 for this estimator; the minimum of the two sample numbers has been taken.');
-        end
-        if dY1~=dY2
-            error('The dimension of the samples in Y1 and Y2 must be equal.');
-        end
-    num_of_samples = min(num_of_samplesY1,num_of_samplesY2);        
-    %Number of samples must be even:
-        if ~all_even(num_of_samples)
-            warning('The number of samples must be even; the last sample has been discarded.');
-            num_of_samples = num_of_samples - 1;
-        end
-        
-%initialization: 
-    odd_indices = [1:2:num_of_samples];
-    even_indices = [2:2:num_of_samples];
-    
-%Y1i,Y1j,Y2i,Y2j:
-    Y1i = Y1(:,odd_indices);
-    Y1j = Y1(:,even_indices);
-    Y2i = Y2(:,odd_indices);
-    Y2j = Y2(:,even_indices);
-
-D = (K(Y1i,Y1j,co) + K(Y2i,Y2j,co) - K(Y1i,Y2j,co) - K(Y1j,Y2i,co)) / (num_of_samples/2);
-
-%-----------------------------
-function [s] = K(U,V,co)
-%Computes \sum_i kernel(U(:,i),V(:,i)); an RBF (Gaussian) kernel is used with std=co.sigma.
-
-s = sum( exp(-sum((U-V).^2,1)/(2*co.sigma^2)) );

code/H_I_D_A_C/base_estimators/DMMDonline_initialization.m

-function [co] = DMMDonline_initialization(mult)
-%Initialization of the online MMD (maximum mean discrepancy) divergence estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co).
-%   2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
-%
-%INPUT:
-%   mult: whether the multiplicative constant is relevant (needed) in the estimation; '=1' means yes, '=0' means no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%mandatory fields:
-    co.name = 'MMDonline';
-    co.mult = mult;
-    
-%other fields:    
-    co.sigma = 0.01;%std in the RBF (Gaussian) kernel

code/H_I_D_A_C/meta_estimators/IMMD_DMMD_initialization.m

     co.mult = mult;
 	
 %other fields:    
-    co.member_name = 'MMDonline'; %you can change it to any MMD estimator
+    co.member_name = 'MMD_online'; %you can change it to any MMD estimator
     co.member_co = D_initialization(co.member_name,mult);
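Since co.member_name is just a string handed to D_initialization, another MMD divergence estimator can be plugged into this meta estimator without further changes; for example (the member name below is hypothetical and must match an installed 'D<name>_initialization'):

co.member_name = 'MMD_Ustat'; %hypothetical alternative MMD estimator
co.member_co = D_initialization(co.member_name,mult);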