Commits

Zoltan Szabo committed 0f83211

Association (A) cost object type and three multivariate extensions of Spearman's rho: added; see 'A_initialization.m', 'A_estimation.m', 'ASpearman1_initialization.m', 'ASpearman1_estimation.m', 'ASpearman2_initialization.m', 'ASpearman2_estimation.m', 'ASpearman3_initialization.m', 'ASpearman3_estimation.m'. Directory 'H_I_D_C' renamed to 'H_I_D_A_C'; 'ITE_install.m' modified accordingly. Verification of the H/I/D/C function arguments: unified across different estimators. Further verification (compatibility of ds and Y): added to I-estimators (the estimations have not changed). Some comment unification: carried out.

  • Participants
  • Parent commits f90e2f2
  • Tags release-0.23

Comments (0)

Files changed (297)

File CHANGELOG.txt

--Verification of the H/I/D/C function arguments: unified across different estimators. Further verification (compatibility of ds and Y): added to I-estimators (the estimations has not changed). Some comment unification: carried out. 
+v0.23 (Dec 07, 2012):
+-Three multivariate extensions of Spearman's rho: added; see 'ASpearman1_initialization.m', 'ASpearman1_estimation.m', 'ASpearman2_initialization.m', 'ASpearman2_estimation.m', 'ASpearman3_initialization.m', 'ASpearman3_estimation.m'.
+-Association (A) cost object type: added; see 'A_initialization.m', 'A_estimation.m'. 
+-Directory 'H_I_D_C' renamed to 'H_I_D_A_C'; 'ITE_install.m' modified accordingly.
+-Verification of the H/I/D/C function arguments: unified across different estimators. Further verification (compatibility of ds and Y): added to I-estimators (the estimations have not changed). Some comment unification: carried out. 
 
 v0.22 (Dec 01, 2012):
 -Cauchy-Schwartz and Euclidean distance based divergence estimators: added; see 'DCS_KDE_iChol_initialization.m', 'DCS_KDE_iChol_estimation.m', 'DED_KDE_iChol_initialization.m', 'DED_KDE_iChol_estimation.m'.
 #ITE (Information Theoretical Estimators)
 
-ITE is capable of estimating many different variants of entropy, mutual information, divergence measures, and related cross quantities. Thanks to its highly modular design, ITE supports additionally 
+ITE is capable of estimating many different variants of entropy, mutual information, divergence, association measures and cross quantities. Thanks to its highly modular design, ITE supports additionally 
 
 - the combinations of the estimation techniques, 
 - the easy construction and embedding of novel information theoretical estimators, and 
 
 ITE can estimate 
 
-- `entropy`: Shannon entropy, Rényi entropy, Tsallis entropy, complex entropy,
+- `entropy`: Shannon entropy, Rényi entropy, Tsallis entropy (Havrda and Charvát entropy), complex entropy,
- `mutual information`: generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon mutual information, L2 mutual information, Rényi mutual information, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa, complex mutual information, Cauchy-Schwartz quadratic mutual information, Euclidean distance based quadratic mutual information,
-- `divergence`: Kullback-Leibler divergence, L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy, J-distance, Cauchy-Schwartz divergence, Euclidean distance based divergence,
-- `cross measures`: cross-entropy.
+- `divergence`: Kullback-Leibler divergence (relative entropy), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance), J-distance (symmetrised Kullback-Leibler divergence), Cauchy-Schwartz divergence, Euclidean distance based divergence,
+- `association measures`: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient),
+- `cross quantities`: cross-entropy.
 
 ITE offers solution methods for 
 
 
 **Download** the latest release: 
 
-- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE-0.22_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE-0.22_code.tar.bz2), 
-- [documentation (pdf)](https://bitbucket.org/szzoli/ite/downloads/ITE-0.22_documentation.pdf).
+- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE-0.23_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE-0.23_code.tar.bz2), 
+- [documentation (pdf)](https://bitbucket.org/szzoli/ite/downloads/ITE-0.23_documentation.pdf).
 
 

File code/H_I_D_A_C/A_estimation.m

+function [A] = A_estimation(Y,ds,co)
+%Estimates association A(y^1,...,y^M), where the m^th subspace is ds(m)-dimensional; using the method/cost object co.
+%
+%INPUT:
+%   Y: Y(:,t) is the t^th sample.
+%  ds: subspace dimensions.
+%  co: association estimator object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%Here, we make use of the naming convention 'A<name>_estimation':
+    eval(['A=A',co.name,'_estimation(Y,ds,co);']);%example: A = ASpearman1_estimation(Y,ds,co);
+    
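The estimator above is resolved purely from the naming convention 'A&lt;name&gt;_estimation' via eval. For readers less familiar with MATLAB's eval, the same convention-based dispatch can be sketched in Python (a hypothetical mirror for illustration only — the registry and the stand-in estimator here are not part of ITE):

```python
# Hypothetical Python mirror of ITE's 'A<name>_estimation' dispatch.
# The cost object is a plain dict; co['name'] selects the concrete estimator.

def ASpearman1_estimation(Y, ds, co):
    # stand-in estimator used only to demonstrate the dispatch
    return 0.0

_REGISTRY = {'ASpearman1_estimation': ASpearman1_estimation}

def A_estimation(Y, ds, co):
    # 'A' + co['name'] + '_estimation' names the concrete estimator,
    # exactly as the eval call does in the MATLAB code above.
    return _REGISTRY['A' + co['name'] + '_estimation'](Y, ds, co)
```

The MATLAB version achieves the same effect by building the function name as a string and letting eval resolve it in scope.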

File code/H_I_D_A_C/A_initialization.m

+function [co] = A_initialization(cost_name,mult)
+%Initialization of an A (association) estimator. The estimator is treated as a cost object (co). 
+%
+%INPUT:
+%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%Here, we make use of the naming convention: 'A<name>_initialization':
+   eval(['co=A',cost_name,'_initialization(mult)']); %example: co = ASpearman1_initialization(mult);
+    
+disp('A initialization: ready.');

File code/H_I_D_A_C/C_estimation.m

+function [C] = C_estimation(Y1,Y2,co)
+%Cross estimation (C) of Y1 and Y2 using the specified cross estimator.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. 
+%  co: cross estimator object.
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%Here, we make use of the naming convention 'C<name>_estimation':
+    eval(['C=C',co.name,'_estimation(Y1,Y2,co);']); %example: C = CCE_kNN_k_estimation(Y1,Y2,co);

File code/H_I_D_A_C/C_initialization.m

+function [co] = C_initialization(cost_name,mult)
+%Initialization of a cross estimator. The estimator is treated as a cost object (co). 
+%   
+%INPUT:
+%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%Here, we make use of the naming convention 'C<name>_initialization':
+   eval(['co=C',cost_name,'_initialization(mult)']); %example: co = CCE_kNN_k_initialization(mult);
+    
+disp('C initialization: ready.');

File code/H_I_D_A_C/D_estimation.m

+function [D] = D_estimation(Y1,Y2,co)
+%Estimates divergence between Y1 and Y2.
+%
+%INPUT:
+%   Y1: Y1(:,t) is the t^th sample from the first distribution. 
+%   Y2: Y2(:,t) is the t^th sample from the second distribution.
+%   co: divergence estimator object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%Here, we make use of the naming convention 'D<name>_estimation':
+    eval(['D=D',co.name,'_estimation(Y1,Y2,co);']);%example: D = DRenyi_kNN_k_estimation(Y1,Y2,co);

File code/H_I_D_A_C/D_initialization.m

+function [co] = D_initialization(cost_name,mult)
+%Initialization of a divergence estimator. The estimator is treated as a cost object (co). 
+%
+%INPUT:
+%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%Here, we make use of the naming convention 'D<name>_initialization':
+   eval(['co=D',cost_name,'_initialization(mult)']); %example: co = DRenyi_kNN_k_initialization(mult);
+    
+disp('D initialization: ready.');

File code/H_I_D_A_C/H_estimation.m

+function [H] = H_estimation(Y,co)
+%Entropy estimation (H) of Y using the specified entropy estimator.
+%
+%INPUT:
+%   Y: Y(:,t) is the t^th sample.
+%  co: entropy estimator object.
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%Here, we make use of the naming convention 'H<name>_estimation':
+	eval(['H=H',co.name,'_estimation(Y,co);']);%example: H = HRenyi_kNN_1tok_estimation(Y,co);

File code/H_I_D_A_C/H_initialization.m

+function [co] = H_initialization(cost_name,mult)
+%Initialization of an entropy estimator. The estimator is treated as a cost object (co). 
+%
+%INPUT:
+%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%Here, we make use of the naming convention 'H<name>_initialization':
+   eval(['co=H',cost_name,'_initialization(mult)']); %example: co = HRenyi_kNN_k_initialization(mult);
+    
+disp('H initialization: ready.');

File code/H_I_D_A_C/I_estimation.m

+function [I] = I_estimation(Y,ds,co)
+%Estimates mutual information I(y^1,...,y^M), where the m^th subspace is ds(m)-dimensional, using the specified estimator.
+%
+%INPUT:
+%   Y: Y(:,t) is the t^th sample.
+%  ds: subspace dimensions.
+%  co: mutual information estimator object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%Here, we make use of the naming convention 'I<name>_estimation':
+    eval(['I=I',co.name,'_estimation(Y,ds,co);']);%example: I = IShannon_HShannon_estimation(Y,ds,co);
+    

File code/H_I_D_A_C/I_initialization.m

+function [co] = I_initialization(cost_name,mult)
+%Initialization of an I (mutual information) estimator. The estimator is treated as a cost object (co). 
+%
+%INPUT:
+%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%Here, we make use of the naming convention 'I<name>_initialization':
+   eval(['co=I',cost_name,'_initialization(mult)']); %example: co = IGV_initialization(mult);
+    
+disp('I initialization: ready.');

File code/H_I_D_A_C/base_estimators/ASpearman1_estimation.m

+function [A] = ASpearman1_estimation(Y,ds,co)
+%Estimates the first multivariate extension of Spearman's rho using empirical copulas.
+%
+%We use the naming convention 'A<name>_estimation' to ease embedding new association estimator methods.
+%
+%INPUT:
+%   Y: Y(:,t) is the t^th sample.
+%  ds: subspace dimensions.
+%  co: association estimator object.
+%
+%REFERENCE:
+%   Friedrich Schmid, Rafael Schmidt, Thomas Blumentritt, Sandra Gaiser, and Martin Ruppert. Copula Theory and Its Applications, Chapter Copula based Measures of Multivariate Association. Lecture Notes in Statistics. Springer, 2010.
+%   C. Spearman. The proof and measurement of association between two things. The American Journal of Psychology, 15:72-101, 1904.
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK.
+
+%verification:
+    if sum(ds) ~= size(Y,1)
+        error('The subspace dimensions are not compatible with Y.');
+    end
+    if ~one_dimensional_problem(ds)
+        error('The subspaces must be one-dimensional for this estimator.');
+    end
+    
+[d,num_of_samples] = size(Y); %dimension, number of samples
+U = copula_transformation(Y);
+
+h = (d+1) / (2^d - (d+1)); %h_rho(d)
+
+A = h * (2^d * mean(prod(1-U)) -1);
+
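For illustration, the statistic above — A = h_rho(d) * (2^d * E[prod_m (1 - U_m)] - 1), with h_rho(d) = (d+1)/(2^d - (d+1)) — can be sketched outside MATLAB. The sketch below assumes the empirical copula transform maps each coordinate to its normalized rank (which is what 'copula_transformation.m' is expected to provide); NumPy/SciPy stand in for the toolbox helpers:

```python
import numpy as np
from scipy.stats import rankdata

def empirical_copula(Y):
    """Map each row (one-dimensional subspace) of the d x n sample
    matrix Y to its normalized ranks in (0, 1]."""
    return np.vstack([rankdata(row) / row.size for row in Y])

def spearman1(Y):
    """First multivariate extension of Spearman's rho:
    A = h_rho(d) * (2^d * E[prod_m (1 - U_m)] - 1)."""
    d = Y.shape[0]
    U = empirical_copula(Y)
    h = (d + 1) / (2**d - (d + 1))            # h_rho(d)
    return h * (2**d * np.prod(1 - U, axis=0).mean() - 1)
```

As a sanity check, comonotone (identically ordered) coordinates drive the statistic toward +1, counter-monotone coordinates toward -1, matching bivariate Spearman's rho up to finite-sample normalization.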

File code/H_I_D_A_C/base_estimators/ASpearman1_initialization.m

+function [co] = ASpearman1_initialization(mult)
+%Initialization of the first multivariate extension of Spearman's rho estimator.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'A<name>_initialization' to ease embedding new association estimator methods.
+%
+%INPUT:
+%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields:
+    co.name = 'Spearman1';
+    co.mult = mult;
+    
+   

File code/H_I_D_A_C/base_estimators/ASpearman2_estimation.m

+function [A] = ASpearman2_estimation(Y,ds,co)
+%Estimates the second multivariate extension of Spearman's rho using empirical copulas.
+%
+%We use the naming convention 'A<name>_estimation' to ease embedding new association estimator methods.
+%
+%INPUT:
+%   Y: Y(:,t) is the t^th sample.
+%  ds: subspace dimensions.
+%  co: association estimator object.
+%
+%REFERENCE:
+%   Friedrich Schmid, Rafael Schmidt, Thomas Blumentritt, Sandra Gaiser, and Martin Ruppert. Copula Theory and Its Applications, Chapter Copula based Measures of Multivariate Association. Lecture Notes in Statistics. Springer, 2010.
+%   C. Spearman. The proof and measurement of association between two things. The American Journal of Psychology, 15:72-101, 1904.
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK.
+
+%verification:
+    if sum(ds) ~= size(Y,1)
+        error('The subspace dimensions are not compatible with Y.');
+    end
+    if ~one_dimensional_problem(ds)
+        error('The subspaces must be one-dimensional for this estimator.');
+    end
+    
+[d,num_of_samples] = size(Y); %dimension, number of samples
+U = copula_transformation(Y);
+
+h = (d+1) / (2^d - (d+1)); %h_rho(d)
+
+A = h * (2^d * mean(prod(U)) -1);
+

File code/H_I_D_A_C/base_estimators/ASpearman2_initialization.m

+function [co] = ASpearman2_initialization(mult)
+%Initialization of the second multivariate extension of Spearman's rho estimator.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'A<name>_initialization' to ease embedding new association estimator methods.
+%
+%INPUT:
+%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields:
+    co.name = 'Spearman2';
+    co.mult = mult;
+    
+   

File code/H_I_D_A_C/base_estimators/ASpearman3_estimation.m

+function [A] = ASpearman3_estimation(Y,ds,co)
+%Estimates the third multivariate extension of Spearman's rho using empirical copulas.
+%
+%We use the naming convention 'A<name>_estimation' to ease embedding new association estimator methods.
+%
+%INPUT:
+%   Y: Y(:,t) is the t^th sample.
+%  ds: subspace dimensions.
+%  co: association estimator object.
+%
+%REFERENCE:
+%   Friedrich Schmid, Rafael Schmidt, Thomas Blumentritt, Sandra Gaiser, and Martin Ruppert. Copula Theory and Its Applications, Chapter Copula based Measures of Multivariate Association. Lecture Notes in Statistics. Springer, 2010.
+%   C. Spearman. The proof and measurement of association between two things. The American Journal of Psychology, 15:72-101, 1904.
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK.
+
+%verification:
+    if sum(ds) ~= size(Y,1)
+        error('The subspace dimensions are not compatible with Y.');
+    end
+    if ~one_dimensional_problem(ds)
+        error('The subspaces must be one-dimensional for this estimator.');
+    end
+    
+[d,num_of_samples] = size(Y); %dimension, number of samples
+U = copula_transformation(Y);
+
+h = (d+1) / (2^d - (d+1)); %h_rho(d)
+
+A1 = h * (2^d * mean(prod(1-U)) -1);
+A2 = h * (2^d * mean(prod(U)) -1);
+A = (A1+A2)/2;
+
+
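Continuing the illustrative sketch under the same assumptions (normalized-rank empirical copula; NumPy/SciPy in place of the toolbox helpers, not part of ITE), the second variant replaces prod(1-U) with prod(U), and the third is simply their average:

```python
import numpy as np
from scipy.stats import rankdata

def _copula(Y):
    # normalized ranks per coordinate, as assumed for 'copula_transformation.m'
    return np.vstack([rankdata(row) / row.size for row in Y])

def spearman2(Y):
    """Second variant: upper-tail corner of the copula, E[prod_m U_m]."""
    d = Y.shape[0]
    U = _copula(Y)
    h = (d + 1) / (2**d - (d + 1))            # h_rho(d)
    return h * (2**d * np.prod(U, axis=0).mean() - 1)

def spearman3(Y):
    """Third variant: average of the lower-tail (rho_1) and
    upper-tail (rho_2) statistics, as in ASpearman3_estimation.m."""
    d = Y.shape[0]
    U = _copula(Y)
    h = (d + 1) / (2**d - (d + 1))
    a1 = h * (2**d * np.prod(1 - U, axis=0).mean() - 1)
    a2 = h * (2**d * np.prod(U, axis=0).mean() - 1)
    return (a1 + a2) / 2
```

Averaging the two corners symmetrizes the finite-sample behavior: for comonotone data the third variant sits very close to 1 even at moderate sample sizes.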

File code/H_I_D_A_C/base_estimators/ASpearman3_initialization.m

+function [co] = ASpearman3_initialization(mult)
+%Initialization of the third multivariate extension of Spearman's rho estimator.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'A<name>_initialization' to ease embedding new association estimator methods.
+%
+%INPUT:
+%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields:
+    co.name = 'Spearman3';
+    co.mult = mult;
+    
+   

File code/H_I_D_A_C/base_estimators/CCE_kNN_k_estimation.m

+function [CE] = CCE_kNN_k_estimation(Y1,Y2,co)
+%Estimates the cross-entropy (CE) of Y1 and Y2 using the kNN method (S={k}).
+%
+%We use the naming convention 'C<name>_estimation' to ease embedding new cross estimation methods.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+%  co: cross estimator object.
+%
+%REFERENCE: 
+%   Nikolai Leonenko, Luc Pronzato, and Vippal Savani. A class of Renyi information estimators for multidimensional densities. Annals of Statistics, 36(5):2153-2182, 2008.
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK.
+
+d1 = size(Y1,1);
+[d2,num_of_samples2] = size(Y2);
+
+%verification:
+    if d1~=d2
+        error('The dimension of the samples in Y1 and Y2 must be equal.');
+    end
+
+d = d1;
+V = volume_of_the_unit_ball(d); 
+squared_distancesY1Y2 = kNN_squared_distances(Y2,Y1,co,0);
+dist_k_Y1Y2 = sqrt(squared_distancesY1Y2(end,:));
+CE = log(V) + log(num_of_samples2) - psi(co.k) + d * mean(log(dist_k_Y1Y2));
+
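The closed form implemented above — CE = log V_d + log N_2 - psi(k) + d * mean(log delta_k), with V_d the volume of the d-dimensional unit ball and delta_k the k-th nearest-neighbor distances from the Y1 samples into Y2 — can be sketched in Python as follows. This is a rough port for illustration under the Leonenko et al. construction, not ITE code; scipy's cKDTree stands in for 'kNN_squared_distances.m':

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def cross_entropy_knn(Y1, Y2, k=3):
    """kNN (S={k}) cross-entropy estimate; Y1, Y2 are d x n arrays
    with samples in columns, matching the ITE convention."""
    d, n2 = Y1.shape[0], Y2.shape[1]
    # log volume of the d-dimensional unit ball: pi^(d/2) / Gamma(d/2 + 1)
    log_V = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    # distance from each Y1 sample to its k-th nearest neighbor in Y2
    dist, _ = cKDTree(Y2.T).query(Y1.T, k=k)
    dist_k = dist[:, -1] if k > 1 else dist
    return log_V + np.log(n2) - digamma(k) + d * np.mean(np.log(dist_k))
```

As a sanity check, when Y1 and Y2 are drawn from the same distribution the cross-entropy reduces to the differential entropy; for the standard normal in one dimension that is 0.5*ln(2*pi*e), roughly 1.42.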

File code/H_I_D_A_C/base_estimators/CCE_kNN_k_initialization.m

+function [co] = CCE_kNN_k_initialization(mult)
+%Initialization of the kNN (k-nearest neighbor, S={k}) based cross-entropy estimator.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'C<name>_initialization' to ease embedding new cross estimation methods.
+%
+%INPUT:
+%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields:
+    co.name = 'CE_kNN_k';
+    co.mult = mult;
+    
+%other fields:
+    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
+        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.                
+        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k. 						
+        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
+        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi. 
+		%I:
+            co.kNNmethod = 'knnFP1';
+            co.k = 3;%k-nearest neighbors				
+		%II:
+            %co.kNNmethod = 'knnFP2';
+            %co.k = 3;%k-nearest neighbors				
+        %III:
+            %co.kNNmethod = 'knnsearch';
+            %co.k = 3;%k-nearest neighbors
+            %co.NSmethod = 'kdtree';
+        %IV:
+            %co.kNNmethod = 'ANN';
+            %co.k = 3;%k-nearest neighbors
+            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (non-squared) distances may exceed the true distances by at most a factor of (1+epsi).
+
+%initialize the ann wrapper in Octave, if needed:
+    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);

File code/H_I_D_A_C/base_estimators/DBhattacharyya_kNN_k_estimation.m

+function [D] = DBhattacharyya_kNN_k_estimation(Y1,Y2,co)
+%Estimates the Bhattacharyya distance of Y1 and Y2 using the kNN method (S={k}).
+%
+%We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+%  co: divergence estimator object.
+%
+%REFERENCE: 
+%   Barnabas Poczos, Liang Xiong, Dougal Sutherland, and Jeff Schneider. Support Distribution Machines. Technical Report, 2012. "http://arxiv.org/abs/1202.0302"
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK.
+
+%verification:
+    if size(Y1,1)~=size(Y2,1)
+        error('The dimension of the samples in Y1 and Y2 must be equal.');
+    end
+
+%D_ab (Bhattacharyya coefficient):
+	if co.p %[p(x)dx]
+		D_ab = estimate_Dtemp2(Y1,Y2,co);
+	else %[q(x)dx]
+		D_ab = estimate_Dtemp2(Y2,Y1,co);
+	end
+
+%D = -log(D_ab);%theoretically
+D = -log(abs(D_ab));%abs() to avoid possible 'log(negative)' values due to the finite number of samples
+
+

File code/H_I_D_A_C/base_estimators/DBhattacharyya_kNN_k_initialization.m

+function [co] = DBhattacharyya_kNN_k_initialization(mult)
+%Initialization of the kNN (k-nearest neighbor, S={k}) based Bhattacharyya distance estimator.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%   mult: '=1' if the multiplicative constant is relevant (needed) in the estimation, '=0' otherwise.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields:
+    co.name = 'Bhattacharyya_kNN_k';
+    co.mult = mult;
+    
+%other fields:
+    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
+        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.        
+        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k.
+        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
+        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi. 
+		%I:
+            co.kNNmethod = 'knnFP1';
+            co.k = 3;%k-nearest neighbors
+  	    %II:
+            %co.kNNmethod = 'knnFP2';
+            %co.k = 3;%k-nearest neighbors
+        %III:
+            %co.kNNmethod = 'knnsearch';
+            %co.k = 3;%k-nearest neighbors
+            %co.NSmethod = 'kdtree';
+        %IV:
+            %co.kNNmethod = 'ANN';
+            %co.k = 3;%k-nearest neighbors
+            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (non-squared) distances may exceed the true distances by at most a factor of (1+epsi).
+	%Possibilities for rewriting the Bhattacharyya distance:
+		%I [\int p^{1/2}(x)q^{1/2}(x)dx = \int p^{-1/2}(x)q^{1/2}(x) p(x)dx]:
+			co.p = 1; %use p [p(x)dx]
+		%II [\int p^{1/2}(x)q^{1/2}(x)dx = \int q^{-1/2}(x)p^{1/2}(x) q(x)dx]:
+			%co.p = 0; %use q instead [q(x)dx]
+	%Fixed parameters, do not change them:
+		co.a = -1/2; 
+		co.b = 1/2;
+				
+%initialize the ann wrapper in Octave, if needed:
+    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
+    

File code/H_I_D_A_C/base_estimators/DCS_KDE_iChol_estimation.m

+function [D] = DCS_KDE_iChol_estimation(Y1,Y2,co)
+%Estimates the Cauchy-Schwartz divergence using Gaussian KDE (kernel density estimation) and incomplete Cholesky decomposition. 
+%
+%We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] must be equal; otherwise their minimum is taken. 
+%  co: divergence estimator object.
+%
+%REFERENCE: 
+%   Sohan Seth and Jose C. Principe. On speeding up computation in information theoretic learning. In International Joint Conference on Neural Networks (IJCNN), pages 2883-2887, 2009.
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK.
+
+%verification:
+    [dY1,num_of_samplesY1] = size(Y1);
+    [dY2,num_of_samplesY2] = size(Y2);
+
+    if dY1~=dY2
+        error('The dimension of the samples in Y1 and Y2 must be equal.');
+    end
+
+    if num_of_samplesY1~=num_of_samplesY2
+        warning('There must be an equal number of samples in Y1 and Y2 for this estimator; the minimum of the two sample numbers has been used.');
+        num_of_samples = min(num_of_samplesY1,num_of_samplesY2);
+        Y1 = Y1(:,1:num_of_samples);
+        Y2 = Y2(:,1:num_of_samples);
+    end
+
+D = d_cs(Y1.',Y2.',co.sigma);
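For reference, the quantity 'd_cs' computes is D_CS(p,q) = -log( int p*q / sqrt(int p^2 * int q^2) ). A naive O(N^2) Gaussian-KDE plug-in, without the incomplete-Cholesky speed-up, can be sketched in NumPy as follows (the function name is hypothetical; the Gaussian-kernel normalization constants cancel in the ratio, so they are dropped):

```python
import numpy as np

def cs_divergence_kde(Y1, Y2, sigma=1.0):
    """Naive plug-in Cauchy-Schwartz divergence via Gaussian KDE; columns are samples."""
    def mean_kernel(A, B):
        diff = A.T[:, None, :] - B.T[None, :, :]
        sq = (diff ** 2).sum(-1)
        # convolving two Gaussian kernels of width sigma gives width sigma*sqrt(2)
        return np.mean(np.exp(-sq / (4.0 * sigma ** 2)))
    cross = mean_kernel(Y1, Y2)
    return -np.log(cross / np.sqrt(mean_kernel(Y1, Y1) * mean_kernel(Y2, Y2)))
```

D_CS is nonnegative and zero exactly when the two sample sets coincide, which makes a quick sanity check.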

File code/H_I_D_A_C/base_estimators/DCS_KDE_iChol_initialization.m

+function [co] = DCS_KDE_iChol_initialization(mult)
+%Initialization of the Cauchy-Schwartz divergence estimator using Gaussian KDE (kernel density estimation) and incomplete Cholesky decomposition.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%   mult: '=1' if the multiplicative constant is relevant (needed) in the estimation, '=0' otherwise.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields:
+    co.name = 'CS_KDE_iChol';
+    co.mult = mult;
+   
+%other fields:
+    co.sigma = 1; %std in the RBF (Gaussian) kernel

File code/H_I_D_A_C/base_estimators/DED_KDE_iChol_estimation.m

+function [D] = DED_KDE_iChol_estimation(Y1,Y2,co)
+%Estimates the Euclidean distance based divergence using Gaussian KDE (kernel density estimation) and incomplete Cholesky decomposition. 
+%
+%We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] must be equal; otherwise their minimum is taken. 
+%  co: divergence estimator object.
+%
+%REFERENCE: 
+%   Sohan Seth and Jose C. Principe. On speeding up computation in information theoretic learning. In International Joint Conference on Neural Networks (IJCNN), pages 2883-2887, 2009.
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK.
+
+%verification:
+    [dY1,num_of_samplesY1] = size(Y1);
+    [dY2,num_of_samplesY2] = size(Y2);
+
+    if dY1~=dY2
+        error('The dimension of the samples in Y1 and Y2 must be equal.');
+    end
+
+    if num_of_samplesY1~=num_of_samplesY2
+        warning('There must be an equal number of samples in Y1 and Y2 for this estimator; the minimum of the two sample numbers has been used.');
+        num_of_samples = min(num_of_samplesY1,num_of_samplesY2);
+        Y1 = Y1(:,1:num_of_samples);
+        Y2 = Y2(:,1:num_of_samples);
+    end
+
+D = d_ed(Y1.',Y2.',co.sigma);

File code/H_I_D_A_C/base_estimators/DED_KDE_iChol_initialization.m

+function [co] = DED_KDE_iChol_initialization(mult)
+%Initialization of the Euclidean distance based divergence estimator using Gaussian KDE (kernel density estimation) and incomplete Cholesky decomposition.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%   mult: '=1' if the multiplicative constant is relevant (needed) in the estimation, '=0' otherwise.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields:
+    co.name = 'ED_KDE_iChol';
+    co.mult = mult;
+   
+%other fields:
+    co.sigma = 1; %std in the RBF (Gaussian) kernel

File code/H_I_D_A_C/base_estimators/DHellinger_kNN_k_estimation.m

+function [D] = DHellinger_kNN_k_estimation(Y1,Y2,co)
+%Estimates the Hellinger distance of Y1 and Y2 using the kNN method (S={k}).
+%
+%We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+%  co: divergence estimator object.
+%
+%REFERENCE: 
+%   Barnabas Poczos, Liang Xiong, Dougal Sutherland, and Jeff Schneider. Support Distribution Machines. Technical Report, 2012. "http://arxiv.org/abs/1202.0302"
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK.
+
+%verification:
+    if size(Y1,1)~=size(Y2,1)
+        error('The dimension of the samples in Y1 and Y2 must be equal.');
+    end
+
+%D_ab (Bhattacharyya coefficient):
+	if co.p %[p(x)dx]
+		D_ab = estimate_Dtemp2(Y1,Y2,co);
+	else %[q(x)dx]
+		D_ab = estimate_Dtemp2(Y2,Y1,co);
+	end
+
+%D = sqrt(1-D_ab);%theoretically
+D = sqrt(abs(1-D_ab));%abs() to avoid possible 'sqrt(negative)' values due to the finite number of samples
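Both this estimator and the Bhattacharyya one above plug an estimate of the Bhattacharyya coefficient BC = int p^{1/2} q^{1/2} into a closed form: D_B = -log(BC) and D_H = sqrt(1-BC). For two 1-D Gaussians with equal variance, BC itself has a closed form, which gives a handy sanity value (illustrative Python, not part of the toolbox):

```python
import numpy as np

# For N(mu1, s^2) and N(mu2, s^2): BC = exp(-(mu1 - mu2)^2 / (8 s^2)).
def bc_gauss(mu1, mu2, s):
    return np.exp(-(mu1 - mu2) ** 2 / (8.0 * s ** 2))

bc = bc_gauss(0.0, 1.0, 1.0)
d_bhatt = -np.log(bc)         # Bhattacharyya distance: exactly 1/8 here
d_hell = np.sqrt(1.0 - bc)    # Hellinger distance
```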

File code/H_I_D_A_C/base_estimators/DHellinger_kNN_k_initialization.m

+function [co] = DHellinger_kNN_k_initialization(mult)
+%Initialization of the kNN (k-nearest neighbor, S={k}) based Hellinger distance estimator.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%   mult: '=1' if the multiplicative constant is relevant (needed) in the estimation, '=0' otherwise.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields:
+    co.name = 'Hellinger_kNN_k';
+    co.mult = mult;
+    
+%other fields:
+    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
+        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.        
+        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k.
+        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
+        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi. 
+		%I:
+            co.kNNmethod = 'knnFP1';
+            co.k = 3;%k-nearest neighbors
+  	    %II:
+            %co.kNNmethod = 'knnFP2';
+            %co.k = 3;%k-nearest neighbors
+        %III:
+            %co.kNNmethod = 'knnsearch';
+            %co.k = 3;%k-nearest neighbors
+            %co.NSmethod = 'kdtree';
+        %IV:
+            %co.kNNmethod = 'ANN';
+            %co.k = 3;%k-nearest neighbors
+            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (non-squared) distances may exceed the true distances by at most a factor of (1+epsi).
+	%Possibilities for rewriting the Hellinger distance:
+		%I [\int p^{1/2}(x)q^{1/2}(x)dx = \int p^{-1/2}(x)q^{1/2}(x) p(x)dx]:
+			co.p = 1; %use p [p(x)dx]
+		%II [\int p^{1/2}(x)q^{1/2}(x)dx = \int q^{-1/2}(x)p^{1/2}(x) q(x)dx]:
+			%co.p = 0; %use q instead [q(x)dx]
+	%Fixed parameters, do not change them:
+		co.a = -1/2; 
+		co.b = 1/2;
+				
+%initialize the ann wrapper in Octave, if needed:
+    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
+    

File code/H_I_D_A_C/base_estimators/DKL_kNN_k_estimation.m

+function [D] = DKL_kNN_k_estimation(Y1,Y2,co)
+%Estimates the Kullback-Leibler divergence (D) of Y1 and Y2 using the kNN method (S={k}). 
+%
+%We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+%  co: divergence estimator object.
+%
+%REFERENCE: 
+%   Fernando Perez-Cruz. Estimation of Information Theoretic Measures for Continuous Random Variables. Advances in Neural Information Processing Systems (NIPS), pp. 1257-1264, 2008.
+%   Nikolai Leonenko, Luc Pronzato, and Vippal Savani. A class of Renyi information estimators for multidimensional densities. Annals of Statistics, 36(5):2153-2182, 2008.
+%   Qing Wang, Sanjeev R. Kulkarni, and Sergio Verdu. Divergence estimation for multidimensional densities via k-nearest-neighbor distances. IEEE Transactions on Information Theory, 55:2392-2405, 2009.
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK.
+
+[dY1,num_of_samplesY1] = size(Y1);
+[dY2,num_of_samplesY2] = size(Y2);
+
+%verification:
+    if dY1~=dY2
+        error('The dimension of the samples in Y1 and Y2 must be equal.');
+    end
+
+d = dY1;
+squared_distancesY1Y1 = kNN_squared_distances(Y1,Y1,co,1);
+squared_distancesY2Y1 = kNN_squared_distances(Y2,Y1,co,0);
+dist_k_Y1Y1 = sqrt(squared_distancesY1Y1(end,:));
+dist_k_Y2Y1 = sqrt(squared_distancesY2Y1(end,:));
+D = d * mean(log(dist_k_Y2Y1./dist_k_Y1Y1)) + log(num_of_samplesY2/(num_of_samplesY1-1));
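The closing line is the Wang-Kulkarni-Verdu formula: D ≈ d·mean(log(nu_k/rho_k)) + log(N_2/(N_1-1)), where rho_k are within-Y1 and nu_k are Y1-to-Y2 kNN distances. A brute-force NumPy sketch (illustrative only; the function name is hypothetical and there is no kd-tree/ANN acceleration):

```python
import numpy as np

def knn_kl_divergence(Y1, Y2, k=3):
    """Sketch of the kNN KL divergence estimator; columns are samples."""
    d, N1 = Y1.shape
    N2 = Y2.shape[1]
    def kth_dist(A, B, skip_self):
        diff = A.T[:, None, :] - B.T[None, :, :]
        dists = np.sort(np.sqrt((diff ** 2).sum(-1)), axis=1)
        return dists[:, k] if skip_self else dists[:, k - 1]  # index 0 is the zero self-distance
    rho = kth_dist(Y1, Y1, True)   # k-th neighbor of Y1 points within Y1
    nu = kth_dist(Y1, Y2, False)   # k-th neighbor of Y1 points among Y2
    return d * np.mean(np.log(nu / rho)) + np.log(N2 / (N1 - 1))
```

For N(0,1) versus N(1,1) the true KL divergence is (mu1-mu2)^2/2 = 0.5, which the sketch approaches on moderately large samples.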

File code/H_I_D_A_C/base_estimators/DKL_kNN_k_initialization.m

+function [co] = DKL_kNN_k_initialization(mult)
+%Initialization of the kNN (k-nearest neighbor, S={k}) based Kullback-Leibler divergence estimator.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%   mult: '=1' if the multiplicative constant is relevant (needed) in the estimation, '=0' otherwise.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields:
+    co.name = 'KL_kNN_k';
+    co.mult = mult;
+    
+%other fields:
+    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
+        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.                
+        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k. 						
+        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
+        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi. 
+		%I:
+            co.kNNmethod = 'knnFP1';
+            co.k = 3;%k-nearest neighbors				
+		%II:
+            %co.kNNmethod = 'knnFP2';
+            %co.k = 3;%k-nearest neighbors				
+        %III:
+            %co.kNNmethod = 'knnsearch';
+            %co.k = 3;%k-nearest neighbors
+            %co.NSmethod = 'kdtree';
+        %IV:
+            %co.kNNmethod = 'ANN';
+            %co.k = 3;%k-nearest neighbors
+            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (non-squared) distances may exceed the true distances by at most a factor of (1+epsi).
+
+%initialize the ann wrapper in Octave, if needed:
+    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);

File code/H_I_D_A_C/base_estimators/DKL_kNN_kiTi_estimation.m

+function [D] = DKL_kNN_kiTi_estimation(Y1,Y2,co)
+%Estimates the Kullback-Leibler divergence (D) of Y1 and Y2 using the kNN method (S={k}).
+%
+%We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+%  co: divergence estimator object.
+%
+%REFERENCE: 
+%   Qing Wang, Sanjeev R. Kulkarni, and Sergio Verdu. Divergence estimation for multidimensional densities via k-nearest-neighbor distances. IEEE Transactions on Information Theory, 55:2392-2405, 2009.
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK.
+
+[dY1,num_of_samplesY1] = size(Y1);
+[dY2,num_of_samplesY2] = size(Y2);
+
+%verification:
+    if dY1~=dY2
+        error('The dimension of the samples in Y1 and Y2 must be equal.');
+    end
+
+d = dY1;
+k1 = floor(sqrt(num_of_samplesY1));
+k2 = floor(sqrt(num_of_samplesY2));
+    
+co.k = k1;
+squared_distancesY1Y1 = kNN_squared_distances(Y1,Y1,co,1);
+    
+co.k = k2;
+squared_distancesY2Y1 = kNN_squared_distances(Y2,Y1,co,0);
+    
+dist_k_Y1Y1 = sqrt(squared_distancesY1Y1(end,:));
+dist_k_Y2Y1 = sqrt(squared_distancesY2Y1(end,:));
+D = d * mean(log(dist_k_Y2Y1./dist_k_Y1Y1)) + log( k1/k2 * num_of_samplesY2/(num_of_samplesY1-1) );
+
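This variant differs from the fixed-k estimator only in the sample-size dependent neighbor counts k_i = floor(sqrt(N_i)) and the extra log(k_1/k_2) correction. A brute-force NumPy sketch (illustrative; the function name is hypothetical):

```python
import numpy as np

def knn_kl_kiti(Y1, Y2):
    """KL estimator with sample-size dependent neighbor counts k_i = floor(sqrt(N_i))."""
    d, N1 = Y1.shape
    N2 = Y2.shape[1]
    k1, k2 = int(np.floor(np.sqrt(N1))), int(np.floor(np.sqrt(N2)))
    def kth_dist(A, B, k, skip_self):
        diff = A.T[:, None, :] - B.T[None, :, :]
        dists = np.sort(np.sqrt((diff ** 2).sum(-1)), axis=1)
        return dists[:, k] if skip_self else dists[:, k - 1]  # index 0 is the zero self-distance
    rho = kth_dist(Y1, Y1, k1, True)
    nu = kth_dist(Y1, Y2, k2, False)
    return d * np.mean(np.log(nu / rho)) + np.log(k1 / k2 * N2 / (N1 - 1))
```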

File code/H_I_D_A_C/base_estimators/DKL_kNN_kiTi_initialization.m

+function [co] = DKL_kNN_kiTi_initialization(mult)
+%Initialization of the kNN (k-nearest neighbor, S_1={k_1}, S_2={k_2}) based Kullback-Leibler divergence estimator. Here, the k_i-s depend on the number of samples; they are set in 'DKL_kNN_kiTi_estimation.m'.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%   mult: '=1' if the multiplicative constant is relevant (needed) in the estimation, '=0' otherwise.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields:
+    co.name = 'KL_kNN_kiTi';
+    co.mult = mult;
+    
+%other fields:
+    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
+        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.                
+        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k. 						
+        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
+        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi. 
+		%I:
+            co.kNNmethod = 'knnFP1';
+		%II:
+            %co.kNNmethod = 'knnFP2';
+        %III:
+            %co.kNNmethod = 'knnsearch';
+            %co.NSmethod = 'kdtree';
+        %IV:
+            %co.kNNmethod = 'ANN';
+            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (non-squared) distances may exceed the true distances by at most a factor of (1+epsi).
+
+%initialize the ann wrapper in Octave, if needed:
+    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);

File code/H_I_D_A_C/base_estimators/DL2_kNN_k_estimation.m

+function [D] = DL2_kNN_k_estimation(Y1,Y2,co)
+%Estimates the L2 divergence (D) of Y1 and Y2 using the kNN method (S={k}).
+%
+%We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+%  co: divergence estimator object.
+%
+%REFERENCE: 
+%   Barnabas Poczos, Zoltan Szabo, and Jeff Schneider. Nonparametric divergence estimators for Independent Subspace Analysis. European Signal Processing Conference (EUSIPCO), pages 1849-1853, 2011.
+%   Barnabas Poczos, Liang Xiong, Jeff Schneider. Nonparametric Divergence: Estimation with Applications to Machine Learning on Distributions. Uncertainty in Artificial Intelligence (UAI), 2011.
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK.
+
+[dY2,num_of_samplesY2] = size(Y2);
+[dY1,num_of_samplesY1] = size(Y1);
+
+%verification:
+    if dY1~=dY2
+        error('The dimension of the samples in Y1 and Y2 must be equal.');
+    end
+
+d = dY1;
+c = volume_of_the_unit_ball(d);%volume of the d-dimensional unit ball
+
+squared_distancesY1Y1 = kNN_squared_distances(Y1,Y1,co,1);
+squared_distancesY2Y1 = kNN_squared_distances(Y2,Y1,co,0);
+dist_k_Y1Y1 = sqrt(squared_distancesY1Y1(end,:));
+dist_k_Y2Y1 = sqrt(squared_distancesY2Y1(end,:));
+
+term1 = mean(dist_k_Y1Y1.^(-d)) * (co.k-1) / ((num_of_samplesY1-1)*c);
+term2 = mean(dist_k_Y2Y1.^(-d)) * 2 * (co.k-1) / (num_of_samplesY2*c);
+term3 = mean((dist_k_Y1Y1.^d) ./ (dist_k_Y2Y1.^(2*d))) *  (num_of_samplesY1 - 1) * (co.k-2) * (co.k-1) / (num_of_samplesY2^2*c*co.k);
+L2 = term1-term2+term3;
+%D = sqrt(L2);%theoretically
+D = sqrt(abs(L2));%abs() to avoid possible complex values: due to the finite number of samples, L2 can be negative, in which case sqrt(L2) would be complex.
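The three kNN terms above estimate int p^2, 2*int p*q and int q^2, so L2 = term1 - term2 + term3 ≈ int (p-q)^2. For two 1-D Gaussians with equal variance these integrals have closed forms, which give a quick reference value for the estimator (illustrative Python, not part of the toolbox):

```python
import numpy as np

# N(mu1, s^2), N(mu2, s^2): int p^2 = int q^2 = 1/(2 s sqrt(pi)),
# int p q = exp(-(mu1 - mu2)^2 / (4 s^2)) / (2 s sqrt(pi)).
def l2_divergence_gauss(mu1, mu2, s):
    c = 1.0 / (2.0 * s * np.sqrt(np.pi))
    L2 = 2.0 * c * (1.0 - np.exp(-(mu1 - mu2) ** 2 / (4.0 * s ** 2)))
    return np.sqrt(L2)
```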

File code/H_I_D_A_C/base_estimators/DL2_kNN_k_initialization.m

+function [co] = DL2_kNN_k_initialization(mult)
+%Initialization of the kNN (k-nearest neighbor, S={k}) based L2 divergence estimator.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%   mult: '=1' if the multiplicative constant is relevant (needed) in the estimation, '=0' otherwise.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields:
+    co.name = 'L2_kNN_k';
+    co.mult = mult;
+    
+%other fields:
+    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
+        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.                       
+        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k. 								
+        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
+        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi.         
+		%I:
+            co.kNNmethod = 'knnFP1';
+            co.k = 3;%k-nearest neighbors				
+		%II:
+            %co.kNNmethod = 'knnFP2';
+            %co.k = 3;%k-nearest neighbors				
+        %III:
+            %co.kNNmethod = 'knnsearch';
+            %co.k = 3;%k-nearest neighbors
+            %co.NSmethod = 'kdtree';
+        %IV:
+            %co.kNNmethod = 'ANN';
+            %co.k = 3;%k-nearest neighbors
+            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the reported (not squared) distances cannot exceed the real distance by more than a factor of (1+epsi).
+
+%initialize the ann wrapper in Octave, if needed:
+    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
+            

File code/H_I_D_A_C/base_estimators/DMMDonline_estimation.m

+function [D] = DMMDonline_estimation(Y1,Y2,co)
+%Estimates divergence (D) of Y1 and Y2 using the MMD (maximum mean discrepancy) method, online. 
+%
+%We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] must be equal; otherwise their minimum is taken.
+%  co: divergence estimator object.
+%
+%REFERENCE: 
+%   Arthur Gretton, Karsten M. Borgwardt, Malte J. Rasch, Bernhard Scholkopf and Alexander Smola. A Kernel Two-Sample Test. Journal of Machine Learning Research 13 (2012) 723-773. See Lemma 14.
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK.
+
+%verification:
+    [dY1,num_of_samplesY1] = size(Y1);
+    [dY2,num_of_samplesY2] = size(Y2);
+    %size(Y1) must be equal to size(Y2):
+        if num_of_samplesY1~=num_of_samplesY2
+            warning('There must be an equal number of samples in Y1 and Y2 for this estimator; the minimum of the two sample numbers has been taken.');
+        end
+        if dY1~=dY2
+            error('The dimension of the samples in Y1 and Y2 must be equal.');
+        end
+    num_of_samples = min(num_of_samplesY1,num_of_samplesY2);        
+    %Number of samples must be even:
+        if ~all_even(num_of_samples)
+            warning('The number of samples must be even; the last sample is discarded.');
+            num_of_samples = num_of_samples - 1;
+        end
+        
+%initialization: 
+    odd_indices = [1:2:num_of_samples];
+    even_indices = [2:2:num_of_samples];
+    
+%Y1i,Y1j,Y2i,Y2j:
+    Y1i = Y1(:,odd_indices);
+    Y1j = Y1(:,even_indices);
+    Y2i = Y2(:,odd_indices);
+    Y2j = Y2(:,even_indices);
+
+D = (K(Y1i,Y1j,co) + K(Y2i,Y2j,co) - K(Y1i,Y2j,co) - K(Y1j,Y2i,co)) / (num_of_samples/2);
+
+%-----------------------------
+function [s] = K(U,V,co)
+%Computes \sum_i kernel(U(:,i),V(:,i)); an RBF (Gaussian) kernel with std co.sigma is used.
+
+s = sum( exp(-sum((U-V).^2,1)/(2*co.sigma^2)) );
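The linear-time (online) MMD statistic of Lemma 14 above pairs up consecutive samples and averages four kernel terms. A minimal NumPy sketch of the same computation (an illustration under the assumption of a Gaussian kernel, as in the Matlab file; names are made up and it is not part of the toolbox):

```python
import numpy as np

def mmd_online(y1, y2, sigma=0.01):
    """Linear-time (online) MMD^2 estimate; y1, y2 are (num_samples, dim) arrays.
    Mirrors the pairing scheme above: odd-indexed vs even-indexed samples."""
    n = min(len(y1), len(y2))
    n -= n % 2  # the estimator needs an even number of samples
    y1i, y1j = y1[0:n:2], y1[1:n:2]
    y2i, y2j = y2[0:n:2], y2[1:n:2]

    def ksum(u, v):
        # sum_i exp(-||u_i - v_i||^2 / (2 sigma^2)): Gaussian (RBF) kernel, paired columns
        return np.exp(-np.sum((u - v) ** 2, axis=1) / (2 * sigma ** 2)).sum()

    return (ksum(y1i, y1j) + ksum(y2i, y2j)
            - ksum(y1i, y2j) - ksum(y1j, y2i)) / (n / 2)
```

Note that, being an unbiased estimate of the squared MMD, the statistic can be slightly negative for identical distributions at finite sample sizes.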

File code/H_I_D_A_C/base_estimators/DMMDonline_initialization.m

+function [co] = DMMDonline_initialization(mult)
+%Initialization of the online MMD (maximum mean discrepancy) divergence estimator.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%   mult: is the multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' means no.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields:
+    co.name = 'MMDonline';
+    co.mult = mult;
+    
+%other fields:    
+    co.sigma = 0.01;%std in the RBF (Gaussian) kernel

File code/H_I_D_A_C/base_estimators/DRenyi_kNN_k_estimation.m

+function [D] = DRenyi_kNN_k_estimation(Y1,Y2,co)
+%Estimates the Renyi divergence (D) of Y1 and Y2 using the kNN method (S={k}). 
+%
+%We use the naming convention 'D<name>_estimation' to ease embedding new divergence estimation methods.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+%  co: divergence estimator object.
+%
+%REFERENCE: 
+%   Barnabas Poczos, Zoltan Szabo, Jeff Schneider. Nonparametric divergence estimators for Independent Subspace Analysis. European Signal Processing Conference (EUSIPCO), pages 1849-1853, 2011.
+%   Barnabas Poczos, Jeff Schneider. On the Estimation of alpha-Divergences. International Conference on Artificial Intelligence and Statistics (AISTATS), pages 609-617, 2011.
+%   Barnabas Poczos, Liang Xiong, Jeff Schneider. Nonparametric Divergence: Estimation with Applications to Machine Learning on Distributions. Uncertainty in Artificial Intelligence (