Zoltan Szabo committed 184c85e

Cross (C) cost object type, cross-entropy estimation based on k-nearest neighbors, Kullback-Leibler divergence meta estimator based on cross-entropy and entropy: added; see 'C_initialization.m', 'C_estimation.m', 'CCE_kNN_k_initialization.m', 'CCE_kNN_k_estimation.m', 'DKL_CCE_HShannon_initialization.m', 'DKL_CCE_HShannon_estimation.m'.
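
Usage sketch for the new estimators (a minimal illustration only: it assumes the new C cost object type mirrors the D dispatch convention shown in the diff below, and the C_estimation(Y1,Y2,co) signature is an assumption by analogy with D_estimation; the sample data is made up):

    %Kullback-Leibler divergence via the new cross-entropy + entropy meta estimator:
    co = D_initialization('KL_CCE_HShannon',1); %would dispatch to DKL_CCE_HShannon_initialization(1)
    Y1 = randn(3,2000); Y2 = randn(3,2000) + 1; %one sample per column
    D = D_estimation(Y1,Y2,co);
    %cross-entropy itself, through the new C object type:
    co = C_initialization('CE_kNN_k',1); %would dispatch to CCE_kNN_k_initialization(1)
    C = C_estimation(Y1,Y2,co);          %assumed signature, by analogy with D_estimation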


Files changed (263)

+v0.21 (Nov 25, 2012):
+-Kullback-Leibler divergence meta estimator based on cross-entropy and entropy: added; see 'DKL_CCE_HShannon_initialization.m', 'DKL_CCE_HShannon_estimation.m'.
+-Cross-entropy estimation based on k-nearest neighbors: added; see 'CCE_kNN_k_initialization.m', 'CCE_kNN_k_estimation.m'.
+-Cross (C) cost object type: added; see 'C_initialization.m', 'C_estimation.m'. 
+-Directory 'H_I_D' renamed to 'H_I_D_C'; 'ITE_install.m' modified accordingly.
 -IGV_estimation: added (#2); documentation: modified accordingly.
 
 v0.20 (Nov 19, 2012):
 #ITE (Information Theoretical Estimators)
 
-ITE is capable of estimating many different variants of entropy, mutual information and divergence measures. Thanks to its highly modular design, ITE supports additionally 
+ITE is capable of estimating many different variants of entropy, mutual information, divergence measures, and related cross quantities. Thanks to its highly modular design, ITE additionally offers 
 
 - the combination of estimation techniques, 
 - the easy construction and embedding of novel information theoretical estimators, 
 - multi-platform support (tested extensively on Windows and Linux), and
 - free, open-source code (released under the GNU GPLv3(>=) license).
 
-ITE can estimate Shannon-, Rényi-, Tsallis entropy; generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon-, L2-, Rényi-, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa; complex variants of entropy and mutual information; L2-, Rényi-, Tsallis-, Kullback-Leibler divergence; Hellinger-, Bhattacharyya distance; maximum mean discrepancy, and J-distance.
+ITE can estimate Shannon-, Rényi-, Tsallis entropy; generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon-, L2-, Rényi-, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa; complex variants of entropy and mutual information; L2-, Rényi-, Tsallis-, Kullback-Leibler divergence; Hellinger-, Bhattacharyya distance; maximum mean discrepancy, J-distance, and cross-entropy.
 
 ITE offers solution methods for 
 
 
 **Download** the latest release: 
 
-- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE-0.20_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE-0.20_code.tar.bz2), 
-- [documentation (pdf)](https://bitbucket.org/szzoli/ite/downloads/ITE-0.20_documentation.pdf).
+- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE-0.21_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE-0.21_code.tar.bz2), 
+- [documentation (pdf)](https://bitbucket.org/szzoli/ite/downloads/ITE-0.21_documentation.pdf).
 
 

code/H_I_D/D_estimation.m

-function [D] = D_estimation(Y1,Y2,co)
-%Estimates the divergence between Y1 and Y2 using the method/cost object co.
-%
-%INPUT:
-%   Y1,Y2: Y1(:,t) is the t^th sample from the first distribution. Y2(:,t) is the t^th sample from the second distribution.
-%  co: divergence estimator object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%Here, we make use of the naming convention 'D<name>_estimation':
-    eval(['D=D',co.name,'_estimation(Y1,Y2,co);']);%example: D = DRenyi_kNN_k_estimation(Y1,Y2,co);
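
For concreteness, a typical call chain under this naming convention (taken from the examples in the comments; the Gaussian samples are illustrative):

    co = D_initialization('Renyi_kNN_k',1);     %builds DRenyi_kNN_k_initialization(1) via eval
    Y1 = randn(2,2000); Y2 = randn(2,2000) + 1; %Y1(:,t), Y2(:,t): t^th samples
    D = D_estimation(Y1,Y2,co);                 %dispatches to DRenyi_kNN_k_estimation(Y1,Y2,co)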

code/H_I_D/D_initialization.m

-function [co] = D_initialization(cost_name,mult)
-%Initialization of a divergence estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co). 
-%   2)Here, we make use of the naming convention: 'D<name>_initialization', to ease embedding new divergence estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation? '=1' means yes, '=0' means no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-eval(['co=D',cost_name,'_initialization(mult)']); %example: co = DRenyi_kNN_k_initialization(mult);
-    
-disp('D initialization: ready.');
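
Thanks to this convention, embedding a new divergence estimator only requires two files on the Matlab/Octave path; a hypothetical skeleton (the name 'MyDiv' and its statistic are illustrative placeholders, not part of ITE):

    %DMyDiv_initialization.m:
    function [co] = DMyDiv_initialization(mult)
        co.name = 'MyDiv'; %mandatory fields
        co.mult = mult;

    %DMyDiv_estimation.m:
    function [D] = DMyDiv_estimation(Y1,Y2,co)
        D = norm(mean(Y1,2)-mean(Y2,2))^2; %placeholder statistic, not a true divergence estimator

After that, co = D_initialization('MyDiv',1); and D = D_estimation(Y1,Y2,co); work unchanged.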

code/H_I_D/H_estimation.m

-function [H] = H_estimation(Y,co)
-%Entropy estimation (H) of Y (Y(:,t) is the t^th sample) using the specified entropy estimator/cost object co.
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%Here, we make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods:
-	eval(['H=H',co.name,'_estimation(Y,co);']);%example: H = HRenyi_kNN_1tok_estimation(Y,co);

code/H_I_D/H_initialization.m

-function [co] = H_initialization(cost_name,mult)
-%Initialization of an entropy estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co). 
-%   2)Here, we make use of the naming convention: 'H<name>_initialization', to ease embedding new entropy estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation? '=1' means yes, '=0' means no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-eval(['co=H',cost_name,'_initialization(mult)']); %example: co = HRenyi_kNN_k_initialization(mult);
-    
-disp('H initialization: ready.');
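
A typical entropy call under the same convention, using an estimator fully contained in this diff (the uniform samples are illustrative; the MST computation relies on the graph library configured via co.MSTmethod in the initialization file):

    co = H_initialization('Renyi_MST',1); %dispatches to HRenyi_MST_initialization(1)
    Y = rand(2,2000);                     %Y(:,t) is the t^th sample
    H = H_estimation(Y,co);               %dispatches to HRenyi_MST_estimation(Y,co)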

code/H_I_D/I_estimation.m

-function [I] = I_estimation(Y,ds,co)
-%Estimates the mutual information I(y^1,...,y^M), where the m^th subspace is
-%ds(m)-dimensional, using the method/cost object co.
-%
-%INPUT:
-%   Y: Y(:,t) is the t^th sample.
-%  ds: subspace dimensions.
-%  co: mutual information estimator object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%Here, we make use of the naming convention 'I<name>_estimation':
-    eval(['I=I',co.name,'_estimation(Y,ds,co);']);%example: I = IShannon_HShannon_estimation(Y,ds,co);
-    

code/H_I_D/I_initialization.m

-function [co] = I_initialization(cost_name,mult)
-%Initialization of an I (mutual information) estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co). 
-%   2)Here, we make use of the naming convention: 'I<name>_initialization', to ease embedding new mutual information estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation? '=1' means yes, '=0' means no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-eval(['co=I',cost_name,'_initialization(mult)']); %example: co = IGV_initialization(mult);
-    
-disp('I initialization: ready.');
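
A typical mutual information call, following the example in the comments (assuming the matching IShannon_HShannon_initialization.m exists, as the convention requires; the data is illustrative):

    ds = [1;1];                                  %two one-dimensional subspaces
    Y = randn(2,5000);                           %Y(:,t) is the t^th sample
    co = I_initialization('Shannon_HShannon',1);
    I = I_estimation(Y,ds,co);                   %dispatches to IShannon_HShannon_estimation(Y,ds,co)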

code/H_I_D/base_estimators/DBhattacharyya_kNN_k_estimation.m

-function [D] = DBhattacharyya_kNN_k_estimation(Y1,Y2,co)
-%Estimates the Bhattacharyya distance of Y1 and Y2 (Y1(:,t), Y2(:,t) is the t^th sample)
-%using the kNN method (S={k}). The number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different. Cost parameters are provided in the cost object co.
-%
-%We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
-%
-%REFERENCE: 
-%	Barnabas Poczos and Liang Xiong and Dougal Sutherland and Jeff Schneider. Support Distribution Machines. Technical Report, 2012. "http://arxiv.org/abs/1202.0302"
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%co.mult:OK.
-
-%verification:
-    if size(Y1,1)~=size(Y2,1)
-        error('The dimension of Y1 and Y2 must be equal.');
-    end
-
-%D_ab (Bhattacharyya coefficient):
-	if co.p %[p(x)dx]
-		D_ab = estimate_Dtemp2(Y1,Y2,co);
-	else %[q(x)dx]
-		D_ab = estimate_Dtemp2(Y2,Y1,co);
-	end
-
-%D = -log(D_ab);%theoretically
-D = -log(abs(D_ab));%abs() to avoid possible 'log(negative)' values due to the finite number of samples
-
-

code/H_I_D/base_estimators/DBhattacharyya_kNN_k_initialization.m

-function [co] = DBhattacharyya_kNN_k_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={k}) based Bhattacharyya distance estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co).
-%   2)We make use of the naming convention 'D<name>_initialization', to ease embedding new divergence estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation? '=1' means yes, '=0' means no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%mandatory fields:
-    co.name = 'Bhattacharyya_kNN_k';
-    co.mult = mult;
-    
-%other fields:
-    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
-        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.        
-        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k.
-        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
-        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi. 
-		%I:
-            co.kNNmethod = 'knnFP1';
-            co.k = 3;%k-nearest neighbors
-  	    %II:
-            %co.kNNmethod = 'knnFP2';
-            %co.k = 3;%k-nearest neighbors
-        %III:
-            %co.kNNmethod = 'knnsearch';
-            %co.k = 3;%k-nearest neighbors
-            %co.NSmethod = 'kdtree';
-        %IV:
-            %co.kNNmethod = 'ANN';
-            %co.k = 3;%k-nearest neighbors
-            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true kNN distances by at most a factor of (1+epsi).
-	%Possibilities for rewriting the Bhattacharyya distance:
-		%I [\int p^{1/2}(x)q^{1/2}(x)dx = \int p^{-1/2}(x)q^{1/2}(x) p(x)dx]:
-			co.p = 1; %use p [p(x)dx]
-		%II [\int p^{1/2}(x)q^{1/2}(x)dx = \int q^{-1/2}(x)p^{1/2}(x) q(x)dx]:
-			%co.p = 0; %use q instead [q(x)dx]
-	%Fixed, do not change it:
-		co.a = -1/2; 
-		co.b = 1/2;
-				
-%initialize the ann wrapper in Octave, if needed:
-    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
-    

code/H_I_D/base_estimators/DHellinger_kNN_k_estimation.m

-function [D] = DHellinger_kNN_k_estimation(Y1,Y2,co)
-%Estimates the Hellinger distance of Y1 and Y2 (Y1(:,t), Y2(:,t) is the t^th sample)
-%using the kNN method (S={k}). The number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different. Cost parameters are provided in the cost object co.
-%
-%We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
-%
-%REFERENCE: 
-%	Barnabas Poczos and Liang Xiong and Dougal Sutherland and Jeff Schneider. Support Distribution Machines. Technical Report, 2012. "http://arxiv.org/abs/1202.0302"
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%co.mult:OK.
-
-%verification:
-    if size(Y1,1)~=size(Y2,1)
-        error('The dimension of Y1 and Y2 must be equal.');
-    end
-
-%D_ab (Bhattacharyya coefficient):
-	if co.p %[p(x)dx]
-		D_ab = estimate_Dtemp2(Y1,Y2,co);
-	else %[q(x)dx]
-		D_ab = estimate_Dtemp2(Y2,Y1,co);
-	end
-
-%D = sqrt(1-D_ab);%theoretically
-D = sqrt(abs(1-D_ab));%abs() to avoid possible 'sqrt(negative)' values due to the finite number of samples
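
This estimator and the Bhattacharyya one above are both functions of the same Bhattacharyya coefficient BC = \int p^{1/2}(x)q^{1/2}(x)dx (D_B = -log(BC), D_H = sqrt(1-BC)), so the two can be cross-checked against each other; a sketch (samples illustrative):

    co_B = D_initialization('Bhattacharyya_kNN_k',1);
    co_H = D_initialization('Hellinger_kNN_k',1);
    Y1 = randn(1,3000); Y2 = randn(1,3000) + 1;
    BC_from_B = exp(-D_estimation(Y1,Y2,co_B));
    BC_from_H = 1 - D_estimation(Y1,Y2,co_H)^2; %the two BC estimates should roughly agree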

code/H_I_D/base_estimators/DHellinger_kNN_k_initialization.m

-function [co] = DHellinger_kNN_k_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={k}) based Hellinger distance estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co).
-%   2)We make use of the naming convention 'D<name>_initialization', to ease embedding new divergence estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation? '=1' means yes, '=0' means no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%mandatory fields:
-    co.name = 'Hellinger_kNN_k';
-    co.mult = mult;
-    
-%other fields:
-    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
-        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.        
-        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k.
-        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
-        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi. 
-		%I:
-            co.kNNmethod = 'knnFP1';
-            co.k = 3;%k-nearest neighbors
-  	    %II:
-            %co.kNNmethod = 'knnFP2';
-            %co.k = 3;%k-nearest neighbors
-        %III:
-            %co.kNNmethod = 'knnsearch';
-            %co.k = 3;%k-nearest neighbors
-            %co.NSmethod = 'kdtree';
-        %IV:
-            %co.kNNmethod = 'ANN';
-            %co.k = 3;%k-nearest neighbors
-            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true kNN distances by at most a factor of (1+epsi).
-	%Possibilities for rewriting the Hellinger distance:
-		%I [\int p^{1/2}(x)q^{1/2}(x)dx = \int p^{-1/2}(x)q^{1/2}(x) p(x)dx]:
-			co.p = 1; %use p [p(x)dx]
-		%II [\int p^{1/2}(x)q^{1/2}(x)dx = \int q^{-1/2}(x)p^{1/2}(x) q(x)dx]:
-			%co.p = 0; %use q instead [q(x)dx]
-	%Fixed, do not change it:
-		co.a = -1/2; 
-		co.b = 1/2;
-				
-%initialize the ann wrapper in Octave, if needed:
-    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
-    

code/H_I_D/base_estimators/DKL_kNN_k_estimation.m

-function [D] = DKL_kNN_k_estimation(Y1,Y2,co)
-%Estimates the Kullback-Leibler divergence (D) of Y1 and Y2 (Y1(:,t), Y2(:,t) is the t^th sample)
-%using the kNN method (S={k}). The number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different. Cost parameters are provided in the cost object co.
-%
-%We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
-%
-%REFERENCE: 
-%   Fernando Perez-Cruz. Estimation of Information Theoretic Measures for Continuous Random Variables. Advances in Neural Information Processing Systems (NIPS), pp. 1257-1264, 2008.
-%   Nikolai Leonenko, Luc Pronzato, and Vippal Savani. A class of Renyi information estimators for multidimensional densities. Annals of Statistics, 36(5):2153-2182, 2008.
-%   Qing Wang, Sanjeev R. Kulkarni, and Sergio Verdu. Divergence estimation for multidimensional densities via k-nearest-neighbor distances. IEEE Transactions on Information Theory, 55:2392-2405, 2009.
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%co.mult:OK.
-
-[dY1,num_of_samplesY1] = size(Y1);
-[dY2,num_of_samplesY2] = size(Y2);
-
-%verification:
-    if dY1~=dY2
-        error('The dimension of Y1 and Y2 must be equal.');
-    end
-
-d = dY1;
-squared_distancesY1Y1 = kNN_squared_distances(Y1,Y1,co,1);
-squared_distancesY2Y1 = kNN_squared_distances(Y2,Y1,co,0);
-dist_k_Y1Y1 = sqrt(squared_distancesY1Y1(end,:));
-dist_k_Y2Y1 = sqrt(squared_distancesY2Y1(end,:));
-D = d * mean(log(dist_k_Y2Y1./dist_k_Y1Y1)) + log(num_of_samplesY2/(num_of_samplesY1-1));
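
A quick sanity check of this estimator on a case with a closed form, KL(N(0,1)||N(1,1)) = 1/2 (sample sizes illustrative):

    co = D_initialization('KL_kNN_k',1);
    Y1 = randn(1,5000);        %samples from N(0,1)
    Y2 = randn(1,5000) + 1;    %samples from N(1,1)
    D = D_estimation(Y1,Y2,co) %expected to be near 0.5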

code/H_I_D/base_estimators/DKL_kNN_k_initialization.m

-function [co] = DKL_kNN_k_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={k}) based Kullback-Leibler divergence estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co).
-%   2)We make use of the naming convention 'D<name>_initialization', to ease embedding new divergence estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation? '=1' means yes, '=0' means no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%mandatory fields:
-    co.name = 'KL_kNN_k';
-    co.mult = mult;
-    
-%other fields:
-    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
-        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.                
-        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k. 						
-        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
-        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi. 
-		%I:
-            co.kNNmethod = 'knnFP1';
-            co.k = 3;%k-nearest neighbors				
-		%II:
-            %co.kNNmethod = 'knnFP2';
-            %co.k = 3;%k-nearest neighbors				
-        %III:
-            %co.kNNmethod = 'knnsearch';
-            %co.k = 3;%k-nearest neighbors
-            %co.NSmethod = 'kdtree';
-        %IV:
-            %co.kNNmethod = 'ANN';
-            %co.k = 3;%k-nearest neighbors
-            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true kNN distances by at most a factor of (1+epsi).
-
-%initialize the ann wrapper in Octave, if needed:
-    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);

code/H_I_D/base_estimators/DKL_kNN_kiTi_estimation.m

-function [D] = DKL_kNN_kiTi_estimation(Y1,Y2,co)
-%Estimates the Kullback-Leibler divergence (D) of Y1 and Y2 (Y1(:,t), Y2(:,t) is the t^th sample)
-%using the kNN method (S={k}). The number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different. Cost parameters are provided in the cost object co.
-%
-%We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
-%
-%REFERENCE: 
-%   Qing Wang, Sanjeev R. Kulkarni, and Sergio Verdu. Divergence estimation for multidimensional densities via k-nearest-neighbor distances. IEEE Transactions on Information Theory, 55:2392-2405, 2009.
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%co.mult:OK.
-
-[dY1,num_of_samplesY1] = size(Y1);
-[dY2,num_of_samplesY2] = size(Y2);
-
-%verification:
-    if dY1~=dY2
-        error('The dimension of Y1 and Y2 must be equal.');
-    end
-
-d = dY1;
-k1 = floor(sqrt(num_of_samplesY1));
-k2 = floor(sqrt(num_of_samplesY2));
-    
-co.k = k1;
-squared_distancesY1Y1 = kNN_squared_distances(Y1,Y1,co,1);
-    
-co.k = k2;
-squared_distancesY2Y1 = kNN_squared_distances(Y2,Y1,co,0);
-    
-dist_k_Y1Y1 = sqrt(squared_distancesY1Y1(end,:));
-dist_k_Y2Y1 = sqrt(squared_distancesY2Y1(end,:));
-D = d * mean(log(dist_k_Y2Y1./dist_k_Y1Y1)) + log( k1/k2 * num_of_samplesY2/(num_of_samplesY1-1) );
-

code/H_I_D/base_estimators/DKL_kNN_kiTi_initialization.m

-function [co] = DKL_kNN_kiTi_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S_1={k_1}, S_2={k_2}) based Kullback-Leibler divergence estimator. Here, the k_i values depend on the number of samples; they are set in 'DKL_kNN_kiTi_estimation.m'.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co).
-%   2)We make use of the naming convention 'D<name>_initialization', to ease embedding new divergence estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation? '=1' means yes, '=0' means no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%mandatory fields:
-    co.name = 'KL_kNN_kiTi';
-    co.mult = mult;
-    
-%other fields:
-    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
-        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.                
-        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k. 						
-        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
-        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi. 
-		%I:
-            co.kNNmethod = 'knnFP1';
-		%II:
-            %co.kNNmethod = 'knnFP2';
-        %III:
-            %co.kNNmethod = 'knnsearch';
-            %co.NSmethod = 'kdtree';
-        %IV:
-            %co.kNNmethod = 'ANN';
-            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true kNN distances by at most a factor of (1+epsi).
-
-%initialize the ann wrapper in Octave, if needed:
-    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);

code/H_I_D/base_estimators/DL2_kNN_k_estimation.m

-function [D] = DL2_kNN_k_estimation(Y1,Y2,co)
-%Estimates the L2 divergence (D) of Y1 and Y2 (Y1(:,t), Y2(:,t) is the t^th sample)
-%using the kNN method (S={k}). The number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different. Cost parameters are provided in the cost object co.
-%
-%We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
-%
-%REFERENCE: 
-%   Barnabas Poczos, Zoltan Szabo, Jeff Schneider: Nonparametric divergence estimators for Independent Subspace Analysis. EUSIPCO-2011, pages 1849-1853.
-%   Barnabas Poczos, Liang Xiong, Jeff Schneider. Nonparametric Divergence: Estimation with Applications to Machine Learning on Distributions. UAI-2011.
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%co.mult:OK.
-
-[dY2,num_of_samplesY2] = size(Y2);
-[dY1,num_of_samplesY1] = size(Y1);
-
-%verification:
-    if dY1~=dY2
-        error('The dimension of Y1 and Y2 must be equal.');
-    end
-
-d = dY1;
-c = volume_of_the_unit_ball(d);%volume of the d-dimensional unit ball
-
-squared_distancesY1Y1 = kNN_squared_distances(Y1,Y1,co,1);
-squared_distancesY2Y1 = kNN_squared_distances(Y2,Y1,co,0);
-dist_k_Y1Y1 = sqrt(squared_distancesY1Y1(end,:));
-dist_k_Y2Y1 = sqrt(squared_distancesY2Y1(end,:));
-
-term1 = mean(dist_k_Y1Y1.^(-d)) * (co.k-1) / ((num_of_samplesY1-1)*c);
-term2 = mean(dist_k_Y2Y1.^(-d)) * 2 * (co.k-1) / (num_of_samplesY2*c);
-term3 = mean((dist_k_Y1Y1.^d) ./ (dist_k_Y2Y1.^(2*d))) *  (num_of_samplesY1 - 1) * (co.k-2) * (co.k-1) / (num_of_samplesY2^2*c*co.k);
-L2 = term1-term2+term3;
-%D = sqrt(L2);%theoretically
-D = sqrt(abs(L2));%due to the finite number of samples L2 can be negative. In this case: sqrt(L2) is complex; to avoid such values we take abs().

code/H_I_D/base_estimators/DL2_kNN_k_initialization.m

-function [co] = DL2_kNN_k_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={k}) based L2 divergence estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co).
-%   2)We make use of the naming convention 'D<name>_initialization', to ease embedding new divergence estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation? '=1' means yes, '=0' means no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%mandatory fields:
-    co.name = 'L2_kNN_k';
-    co.mult = mult;
-    
-%other fields:
-    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
-        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.                       
-        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k. 								
-        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
-        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi.         
-		%I:
-            co.kNNmethod = 'knnFP1';
-            co.k = 3;%k-nearest neighbors				
-		%II:
-            %co.kNNmethod = 'knnFP2';
-            %co.k = 3;%k-nearest neighbors				
-        %III:
-            %co.kNNmethod = 'knnsearch';
-            %co.k = 3;%k-nearest neighbors
-            %co.NSmethod = 'kdtree';
-        %IV:
-            %co.kNNmethod = 'ANN';
-            %co.k = 3;%k-nearest neighbors
-            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true kNN distances by at most a factor of (1+epsi).
-
-%initialize the ann wrapper in Octave, if needed:
-    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
-            

code/H_I_D/base_estimators/DMMDonline_estimation.m

-function [D] = DMMDonline_estimation(Y1,Y2,co)
-%Estimates divergence (D) of Y1 and Y2 (Y1(:,t), Y2(:,t) is the t^th sample) using the MMD (maximum mean discrepancy) method, online. The number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] must be equal. Cost parameters are provided in the cost object co.
-%
-%We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
-%
-%REFERENCE: 
-%  Arthur Gretton, Karsten M. Borgwardt, Malte J. Rasch, Bernhard Scholkopf and Alexander Smola. A Kernel Two-Sample Test. Journal of Machine  Learning Research 13 (2012) 723-773. See Lemma 14.
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%co.mult:OK.
-
-%verification:
-    [dY1,num_of_samplesY1] = size(Y1);
-    [dY2,num_of_samplesY2] = size(Y2);
-    %size(Y1) must be equal to size(Y2):
-        if num_of_samplesY1~=num_of_samplesY2
-            warning('There must be an equal number of samples in Y1 and Y2; the minimum of the two sample numbers is used.');
-        end
-        if dY1~=dY2
-            error('The dimension of Y1 and Y2 must be equal.');
-        end
-    num_of_samples = min(num_of_samplesY1,num_of_samplesY2);        
-    %Number of samples must be even:
-        if ~all_even(num_of_samples)
-            warning('The number of samples must be even; the last sample is discarded.');
-            num_of_samples = num_of_samples - 1;
-        end
-        
-%initialization: 
-    odd_indices = [1:2:num_of_samples];
-    even_indices = [2:2:num_of_samples];
-    
-%Y1i,Y1j,Y2i,Y2j:
-    Y1i = Y1(:,odd_indices);
-    Y1j = Y1(:,even_indices);
-    Y2i = Y2(:,odd_indices);
-    Y2j = Y2(:,even_indices);
-
-D = (K(Y1i,Y1j,co) + K(Y2i,Y2j,co) - K(Y1i,Y2j,co) - K(Y1j,Y2i,co)) / (num_of_samples/2);
-
-%-----------------------------
-function [s] = K(U,V,co)
-%Computes \sum_i kernel(U(:,i),V(:,i)); an RBF (Gaussian) kernel with std = co.sigma is used.
-
-s = sum( exp(-sum((U-V).^2,1)/(2*co.sigma^2)) );
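
Note that the RBF bandwidth co.sigma (set to 0.01 in the initialization below) typically has to be matched to the scale of the data; a usage sketch (bandwidth and samples illustrative):

    co = D_initialization('MMDonline',1);
    co.sigma = 1;                           %bandwidth on the order of the data scale
    Y1 = randn(2,4000); Y2 = randn(2,4000); %equal, even sample numbers, as required above
    D = D_estimation(Y1,Y2,co)              %near 0: the two samples share a distribution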

code/H_I_D/base_estimators/DMMDonline_initialization.m

-function [co] = DMMDonline_initialization(mult)
-%Initialization of the online MMD (maximum mean discrepancy) divergence estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co).
-%   2)We make use of the naming convention 'D<name>_initialization', to ease embedding new divergence estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation? '=1' means yes, '=0' means no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%mandatory fields:
-    co.name = 'MMDonline';
-    co.mult = mult;
-    
-%other fields:    
-    co.sigma = 0.01;%std in the RBF (Gaussian) kernel

code/H_I_D/base_estimators/DRenyi_kNN_k_estimation.m

-function [D] = DRenyi_kNN_k_estimation(Y1,Y2,co)
-%Estimates the Renyi divergence (D) of Y1 and Y2 (Y1(:,t), Y2(:,t) is the t^th sample)
-%using the kNN method (S={k}). The number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different. Cost parameters are provided in the cost object co.
-%
-%We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
-%
-%REFERENCE: 
-%   Barnabas Poczos, Zoltan Szabo, Jeff Schneider: Nonparametric divergence estimators for Independent Subspace Analysis. EUSIPCO-2011, pages 1849-1853.
-%   Barnabas Poczos, Jeff Schneider: On the Estimation of alpha-Divergences. AISTATS-2011, pages 609-617.
-%   Barnabas Poczos, Liang Xiong, Jeff Schneider. Nonparametric Divergence: Estimation with Applications to Machine Learning on Distributions. UAI-2011.
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%co.mult:OK.
-
-%verification:
-    if size(Y1,1)~=size(Y2,1)
-        error('The dimension of Y1 and Y2 must be equal.');
-    end
-
-Dtemp1 = estimate_Dtemp1(Y1,Y2,co);
-D = log(Dtemp1) / (co.alpha-1);

code/H_I_D/base_estimators/DRenyi_kNN_k_initialization.m

-function [co] = DRenyi_kNN_k_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={k}) based Renyi divergence estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co).
-%   2)We make use of the naming convention 'D<name>_initialization', to ease embedding new divergence estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation? '=1' means yes, '=0' means no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%mandatory fields:
-    co.name = 'Renyi_kNN_k';
-    co.mult = mult;
-    
-%other fields:
-    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
-        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.                
-        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k. 						
-        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
-        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi. 
-		%I:
-            co.kNNmethod = 'knnFP1';
-            co.k = 3;%k-nearest neighbors				
-		%II:
-            %co.kNNmethod = 'knnFP2';
-            %co.k = 3;%k-nearest neighbors				
-        %III:
-            %co.kNNmethod = 'knnsearch';
-            %co.k = 3;%k-nearest neighbors
-            %co.NSmethod = 'kdtree';
-        %IV:
-            %co.kNNmethod = 'ANN';
-            %co.k = 3;%k-nearest neighbors
-            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true kNN distances by at most a factor of (1+epsi).
-            
-    co.alpha = 0.99; %The Renyi divergence (D_{R,alpha}) converges to the Kullback-Leibler divergence (D) in the limit: D_{R,alpha} -> D as alpha -> 1.
-
-%initialize the ann wrapper in Octave, if needed:
-    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);

code/H_I_D/base_estimators/DTsallis_kNN_k_estimation.m

-function [D] = DTsallis_kNN_k_estimation(Y1,Y2,co)
-%Estimates the Tsallis divergence (D) of Y1 and Y2 (Y1(:,t), Y2(:,t) is the t^th sample)
-%using the kNN method (S={k}). The number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different. Cost parameters are provided in the cost object co.
-%
-%We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
-%
-%REFERENCE: 
-%   Barnabas Poczos, Zoltan Szabo, Jeff Schneider: Nonparametric divergence estimators for Independent Subspace Analysis. EUSIPCO-2011, pages 1849-1853.
-%   Barnabas Poczos, Jeff Schneider: On the Estimation of alpha-Divergences. AISTATS-2011, pages 609-617.
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%co.mult:OK.
-
-%verification:
-    if size(Y1,1)~=size(Y2,1)
-        error('The dimension of Y1 and Y2 must be equal.');
-    end
-
-Dtemp1 = estimate_Dtemp1(Y1,Y2,co);
-D = (Dtemp1-1)/(co.alpha-1);
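
As the comments in the initialization files note, both the Renyi and the Tsallis divergence tend to the Kullback-Leibler divergence as alpha -> 1, which gives another cross-check (alpha stays at its default 0.99; samples illustrative, with KL(N(0,1)||N(1,1)) = 1/2):

    co_T = D_initialization('Tsallis_kNN_k',1);
    co_R = D_initialization('Renyi_kNN_k',1);
    co_K = D_initialization('KL_kNN_k',1);
    Y1 = randn(1,5000); Y2 = randn(1,5000) + 1;
    [D_estimation(Y1,Y2,co_T), D_estimation(Y1,Y2,co_R), D_estimation(Y1,Y2,co_K)] %all near 0.5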

code/H_I_D/base_estimators/DTsallis_kNN_k_initialization.m

-function [co] = DTsallis_kNN_k_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={k}) based Tsallis divergence estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co).
-%   2)We make use of the naming convention 'D<name>_initialization', to ease embedding new divergence estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation? '=1' means yes, '=0' means no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%mandatory fields:
-    co.name = 'Tsallis_kNN_k';
-    co.mult = mult;
-    
-%other fields:
-    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
-        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.        
-        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k.
-        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
-        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi. 
-		%I:
-            co.kNNmethod = 'knnFP1';
-            co.k = 3;%k-nearest neighbors
-  	    %II:
-            %co.kNNmethod = 'knnFP2';
-            %co.k = 3;%k-nearest neighbors
-        %III:
-            %co.kNNmethod = 'knnsearch';
-            %co.k = 3;%k-nearest neighbors
-            %co.NSmethod = 'kdtree';
-        %IV:
-            %co.kNNmethod = 'ANN';
-            %co.k = 3;%k-nearest neighbors
-            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true kNN distances by at most a factor of (1+epsi).
-				
-    co.alpha = 0.99; %The Tsallis divergence (D_{T,alpha}) converges to the Kullback-Leibler divergence (D) in the limit: D_{T,alpha} -> D as alpha -> 1.
-
-%initialize the ann wrapper in Octave, if needed:
-    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
-    

code/H_I_D/base_estimators/HRenyi_GSF_estimation.m

-function [H] = HRenyi_GSF_estimation(Y,co)
-%Estimates the Renyi entropy (H) of Y (Y(:,t) is the t^th sample)
-%using the GSF method. Cost parameters are provided in the cost object co.
-%
-%We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
-%
-%REFERENCE: 
-%     Jose A. Costa, Alfred O. Hero. Geodesic entropic graphs for dimension and entropy estimation in manifold learning. IEEE Transactions on Signal Processing 52(8):2210-2221, 2004.
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%co.mult:OK.
-
-[d,num_of_samples] = size(Y);%dimension, number of samples
-
-%length (L):
-    L = compute_length_HRenyi_GSF(Y,co);
-
-%estimation:
-    if co.additive_constant_is_relevant %the additive constant is relevant in the Renyi entropy estimation
-        FN = filename_of_HRenyi_constant(d,co);
-        if exist(FN)
-            load(FN,'constant');
-            H = log( L / (constant*num_of_samples^co.alpha) ) / (1-co.alpha);
-        else
-            error('The file containing the additive constant does not exist. You can precompute the additive constant via estimate_HRenyi_constant.m.');
-        end
-    else %estimation up to an additive constant
-        H = log( L / num_of_samples^co.alpha ) / (1-co.alpha);
-    end

code/H_I_D/base_estimators/HRenyi_GSF_initialization.m

-function [co] = HRenyi_GSF_initialization(mult)
-%Initialization of the GSF (geodesic spanning forest) based Renyi entropy estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co).
-%   2) We make use of the naming convention 'H<name>_initialization', to ease embedding new entropy estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation? '=1' means yes, '=0' means no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%mandatory fields:
-    co.name = 'Renyi_GSF';
-    co.mult = mult;
-    
-%other fields:
-    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
-        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.
-        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k. 										
-        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
-        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi. 
-		%I:
-            co.kNNmethod = 'knnFP1';
-            co.k = 3;%k-nearest neighbors
-		%II:
-            %co.kNNmethod = 'knnFP2';
-            %co.k = 3;%k-nearest neighbors
-        %III:
-            %co.kNNmethod = 'knnsearch';
-            %co.k = 3;%k-nearest neighbors
-            %co.NSmethod = 'kdtree';
-        %IV:
-            %co.kNNmethod = 'ANN';
-            %co.k = 3;%k-nearest neighbors
-            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true kNN distances by at most a factor of (1+epsi).
-            
-    %Method to compute the geodesic spanning forest:
-        co.GSFmethod = 'MatlabBGL_Kruskal';
-
-    co.alpha = 0.99; %The Renyi entropy (H_{R,alpha}) converges to the Shannon differential entropy (H) in the limit: H_{R,alpha} -> H as alpha -> 1.
-    co.additive_constant_is_relevant = 0; %1:additive constant is relevant (you can precompute it via 'estimate_HRenyi_constant.m'), 0:not relevant
-    
-%initialize the ann wrapper in Octave, if needed:
-    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
-    

code/H_I_D/base_estimators/HRenyi_MST_estimation.m

-function [H] = HRenyi_MST_estimation(Y,co)
-%Estimates the Renyi entropy (H) of Y (Y(:,t) is the t^th sample)
-%using the minimum spanning tree (MST). Cost parameters are provided in the cost object co.
-%
-%We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
-%
-%REFERENCE: 
-%   Joseph E. Yukich. Probability Theory of Classical Euclidean Optimization Problems, Lecture Notes in Mathematics, 1998, vol. 1675.
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%co.mult:OK.
-
-[d,num_of_samples] = size(Y);%dimension, number of samples
-
-%length (L):
-    L = compute_length_HRenyi_MST(Y,co);
-
-%estimation:
-    if co.additive_constant_is_relevant %the additive constant is relevant in the Renyi entropy estimation
-        FN = filename_of_HRenyi_constant(d,co);
-        if exist(FN)
-            load(FN,'constant');
-            H = log( L / (constant*num_of_samples^co.alpha) ) / (1-co.alpha);
-        else
-            error('The file containing the additive constant does not exist. You can precompute the additive constant via estimate_HRenyi_constant.m.');
-        end
-    else %estimation up to an additive constant
-        H = log( L / num_of_samples^co.alpha ) / (1-co.alpha);
-    end
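
A sketch of enabling the additive constant branch above (assumptions: ITE on the path; the calling convention of 'estimate_HRenyi_constant.m' is documented in that file and is not reproduced here):

    co = HRenyi_MST_initialization(1);
    co.additive_constant_is_relevant = 1; %the saved constant is then loaded and used
    %precompute the constant via 'estimate_HRenyi_constant.m' for the given dimension
    %and cost object first; otherwise the estimator errors out, as implemented above.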

code/H_I_D/base_estimators/HRenyi_MST_initialization.m

-function [co] = HRenyi_MST_initialization(mult)
-%Initialization of the minimum spanning tree (MST) based Renyi entropy estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co).
-%   2)We make use of the naming convention 'H<name>_initialization', to ease embedding new entropy estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%mandatory fields:
-    co.name = 'Renyi_MST';
-    co.mult = mult;
-    
-%other fields:
-    co.alpha = 0.99; %The Renyi entropy (H_{R,alpha}) converges to the Shannon differential entropy (H) in the limit: H_{R,alpha} -> H as alpha -> 1.
-    %Possibilities for the MST (minimum spanning tree) method:
-        co.MSTmethod = 'MatlabBGL_Prim';
-        %co.MSTmethod = 'MatlabBGL_Kruskal';
-        %co.MSTmethod = 'pmtk3_Prim';
-        %co.MSTmethod = 'pmtk3_Kruskal';    
-    co.additive_constant_is_relevant = 0; %1:additive constant is relevant (you can precompute it via 'estimate_HRenyi_constant.m'), 0:not relevant
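
A minimal usage sketch for this estimator pair (assumption: ITE on the path; sample size and distribution are illustrative only):

    co = HRenyi_MST_initialization(1); %mult = 1
    Y  = randn(2,2000);                %2000 samples from a 2-dimensional standard normal
    H  = HRenyi_MST_estimation(Y,co);  %by default, exact only up to an additive constant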

code/H_I_D/base_estimators/HRenyi_kNN_1tok_estimation.m

-function [H] = HRenyi_kNN_1tok_estimation(Y,co)
-%Estimates the Renyi entropy (H) of Y (Y(:,t) is the t^th sample) using the kNN method (S={1,...,k}). Cost parameters are provided in the cost object co.
-%
-%We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
-%
-%REFERENCE: 
-%   Barnabas Poczos, Andras Lorincz. Independent Subspace Analysis Using k-Nearest Neighborhood Estimates. ICANN-2005, pages 163-168.
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%co.mult:OK.
-
-[d,num_of_samples] = size(Y);%dimension, number of samples
-
-%length (L):
-    L = compute_length_HRenyi_kNN_1tok(Y,co);
-
-%estimation:
-    if co.additive_constant_is_relevant %the additive constant is relevant in the Renyi entropy estimation
-        FN = filename_of_HRenyi_constant(d,co);
-        if exist(FN)
-            load(FN,'constant');
-            H = log( L / (constant*num_of_samples^co.alpha) ) / (1-co.alpha);
-        else
-            error('The file containing the additive constant does not exist. You can precompute the additive constant via estimate_HRenyi_constant.m.');
-        end
-    else %estimation up to an additive constant
-        H = log( L / num_of_samples^co.alpha ) / (1-co.alpha);
-    end

code/H_I_D/base_estimators/HRenyi_kNN_1tok_initialization.m

-function [co] = HRenyi_kNN_1tok_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={1,...,k}) based Renyi entropy estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co).
-%   2) We make use of the naming convention 'H<name>_initialization', to ease embedding new entropy estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%mandatory fields:
-    co.name = 'Renyi_kNN_1tok';
-    co.mult = mult;
-    
-%other fields:
-    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
-        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.                        
-        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k. 										
-        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
-        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi.         
-		%I:
-            co.kNNmethod = 'knnFP1';
-            co.k = 3;%k-nearest neighbors
-		%II:
-            %co.kNNmethod = 'knnFP2';
-            %co.k = 3;%k-nearest neighbors
-        %III:
-            %co.kNNmethod = 'knnsearch';
-            %co.k = 3;%k-nearest neighbors
-            %co.NSmethod = 'kdtree';
-        %IV:
-            %co.kNNmethod = 'ANN';
-            %co.k = 3;%k-nearest neighbors
-            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true nearest neighbor distances by at most a factor of (1+epsi).
-
-    co.alpha = 0.99; %The Renyi entropy (H_{R,alpha}) converges to the Shannon differential entropy (H) in the limit: H_{R,alpha} -> H as alpha -> 1.
-    co.additive_constant_is_relevant = 0; %1:additive constant is relevant (you can precompute it via 'estimate_HRenyi_constant.m'), 0:not relevant
-    
-%initialize the ann wrapper in Octave, if needed:
-    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
-    
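
A sketch of the additive-constant caveat (assumption: ITE on the path): with the default co.additive_constant_is_relevant = 0 the returned value is exact only up to an additive constant, so entropy differences are meaningful while absolute values are not:

    co = HRenyi_kNN_1tok_initialization(1);
    H1 = HRenyi_kNN_1tok_estimation(randn(2,2000),co);   %samples from N(0,I)
    H2 = HRenyi_kNN_1tok_estimation(2*randn(2,2000),co); %samples from N(0,4I)
    %H2 - H1 approximates d*log(2) = 2*log(2); the common additive constant cancels.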

code/H_I_D/base_estimators/HRenyi_kNN_S_estimation.m

-function [H] = HRenyi_kNN_S_estimation(Y,co)
-%Estimates the Renyi entropy (H) of Y (Y(:,t) is the t^th sample) using the generalized k-nearest neighbor (S\subseteq {1,...,k}) method. Cost parameters are provided in the cost object co.
-%
-%We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
-%
-%REFERENCE: 
-%   David Pal, Barnabas Poczos, Csaba Szepesvari: Estimation of Renyi Entropy and Mutual Information Based on Generalized Nearest-Neighbor Graphs. NIPS-2010, pages 1849-1857.
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%co.mult:OK.
-
-[d,num_of_samples] = size(Y);%dimension, number of samples
-
-%length (L):
-    L = compute_length_HRenyi_kNN_S(Y,co);
-
-%estimation:
-    if co.additive_constant_is_relevant %the additive constant is relevant in the Renyi entropy estimation
-        FN = filename_of_HRenyi_constant(d,co);
-        if exist(FN)
-            load(FN,'constant');
-            H = log( L / (constant*num_of_samples^co.alpha) ) / (1-co.alpha);
-        else
-            error('The file containing the additive constant does not exist. You can precompute the additive constant via estimate_HRenyi_constant.m.');
-        end
-    else %estimation up to an additive constant
-        H = log( L / num_of_samples^co.alpha ) / (1-co.alpha);
-    end
-  

code/H_I_D/base_estimators/HRenyi_kNN_S_initialization.m

-function [co] = HRenyi_kNN_S_initialization(mult)
-%Initialization of the generalized k-nearest neighbor (S\subseteq {1,...,k}) based Renyi entropy estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co).
-%   2)We make use of the naming convention 'H<name>_initialization', to ease embedding new entropy estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%mandatory fields:
-    co.name = 'Renyi_kNN_S';
-    co.mult = mult;
-    
-%other fields:
-    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
-        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.        
-        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k. 												
-        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').        
-        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi.
-		%I:
-            co.kNNmethod = 'knnFP1';
-            co.k = [1,2,4];%=S: nearest neighbor set
-   	    %II:
-            %co.kNNmethod = 'knnFP2';
-            %co.k = [1,2,4];%=S: nearest neighbor set
-        %III:
-            %co.kNNmethod = 'knnsearch';
-            %co.k = [1,2,4];%=S: nearest neighbor set
-            %co.NSmethod = 'kdtree';
-        %IV:
-            %co.kNNmethod = 'ANN';
-            %co.k = [1,2,4];%=S: nearest neighbor set
-            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true nearest neighbor distances by at most a factor of (1+epsi).
-				
-    co.alpha = 0.99; %The Renyi entropy (H_{R,alpha}) converges to the Shannon differential entropy (H) in the limit: H_{R,alpha} -> H as alpha -> 1.
-    co.additive_constant_is_relevant = 0; %1:additive constant is relevant (you can precompute it via 'estimate_HRenyi_constant.m'), 0:not relevant    
-    
-%initialize the ann wrapper in Octave, if needed:
-    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
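
A sketch of choosing the neighbor set S (assumption: ITE on the path); S = {1,...,k} recovers the '1tok' variant, and an arbitrary subset is given as a vector:

    co = HRenyi_kNN_S_initialization(1);
    co.k = [2,5];                                  %S = {2,5}: only the 2nd and 5th neighbors
    H = HRenyi_kNN_S_estimation(randn(2,1000),co); %up to an additive constant by default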

code/H_I_D/base_estimators/HRenyi_kNN_k_estimation.m

-function [H] = HRenyi_kNN_k_estimation(Y,co)
-%Estimates the Renyi entropy (H) of Y (Y(:,t) is the t^th sample)
-%using the kNN method (S={k}). Cost parameters are provided in the cost object co.
-%
-%We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
-%
-%REFERENCE: 
-%   Nikolai Leonenko, Luc Pronzato, and Vippal Savani. A class of Renyi information estimators for multidimensional densities. Annals of Statistics, 36(5):2153-2182, 2008.
-%   Joseph E. Yukich. Probability Theory of Classical Euclidean Optimization Problems, Lecture Notes in Mathematics, 1998, vol. 1675.
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%co.mult:OK.
-
-I_alpha = estimate_Ialpha(Y,co);
-H = log(I_alpha) / (1-co.alpha);

code/H_I_D/base_estimators/HRenyi_kNN_k_initialization.m

-function [co] = HRenyi_kNN_k_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={k}) based Renyi entropy estimator.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co).
-%   2)We make use of the naming convention 'H<name>_initialization', to ease embedding new entropy estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%mandatory fields:
-    co.name = 'Renyi_kNN_k';
-    co.mult = mult;
-    
-%other fields:
-    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
-        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.                
-        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k. 												
-        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
-        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi.         
-		%I:
-            co.kNNmethod = 'knnFP1';
-            co.k = 3;%k-nearest neighbors
-		%II:
-            %co.kNNmethod = 'knnFP2';
-            %co.k = 3;%k-nearest neighbors
-        %III:
-            %co.kNNmethod = 'knnsearch';
-            %co.k = 3;%k-nearest neighbors
-            %co.NSmethod = 'kdtree';
-        %IV:
-            %co.kNNmethod = 'ANN';
-            %co.k = 3;%k-nearest neighbors
-            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true nearest neighbor distances by at most a factor of (1+epsi).
-            
-    co.alpha = 0.95; %The Renyi entropy (H_{R,alpha}) converges to the Shannon differential entropy (H) in the limit: H_{R,alpha} -> H as alpha -> 1.
-    
-%initialize the ann wrapper in Octave, if needed:
-    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
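
A quick sanity check (assumptions: ITE on the path; the closed form of the Renyi entropy of a d-dimensional standard normal is H_alpha = (d/2)*log(2*pi) - (d/2)*log(alpha)/(1-alpha) nats):

    co = HRenyi_kNN_k_initialization(1); %alpha = 0.95 by default
    d = 2; Y = randn(d,5000);
    H_hat  = HRenyi_kNN_k_estimation(Y,co);
    H_true = (d/2)*log(2*pi) - (d/2)*log(co.alpha)/(1-co.alpha); %approx. 2.86 nats
    %H_hat should be close to H_true at this sample size.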

code/H_I_D/base_estimators/HRenyi_spacing_E_estimation.m

-function [H] = HRenyi_spacing_E_estimation(Y,co)
-%Estimates the Renyi entropy (H) of Y (Y(:,t) is the t^th sample) using an extension of the 'empiric entropy estimation of order m' method. Cost parameters are provided in the cost object co.
-%
-%We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods. 
-%
-%REFERENCE:
-%   Mark P. Wachowiak, Renata Smolikova, Georgia D. Tourassi, and Adel S. Elmaghraby. Estimation of generalized entropies with sample spacing. Pattern Analysis and Applications 8: 95-101, 2005.
-%      
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-[d,num_of_samples] = size(Y);
-
-%verification:
-    if d~=1
-        error('The samples must be one-dimensional for this estimator.');
-    end
-    
-m = floor(sqrt(num_of_samples));%m/num_of_samples->0, m,num_of_samples->infty; m<num_of_samples/2
-Y_sorted = sort(Y);
-Y_sorted = [repmat(Y_sorted(1),1,m),Y_sorted,repmat(Y_sorted(end),1,m)];
-d1 = diff(Y_sorted)/2; %(y_(k)-y_(k-1))/2
-    
-diff1 = ev(d1,1,m-1,m);
-diff2 = ev(d1,0,num_of_samples-m,m) + ev(d1,m,num_of_samples,m);
-diff3 = ev(d1,num_of_samples+1-m,num_of_samples-1,m);
-    
-dm = Y_sorted([1+m:num_of_samples+m]+m) - Y_sorted([1-m:num_of_samples-m]+m);
-dm_inv = 2./dm;%2/[y_(k+m)-y_(k-m)]
-    
-idiff1 = cumsum(ev(dm_inv,1-m,-1,m));
-idiff2_temp = ev(dm_inv,1-m,num_of_samples-m,m);%L-sum
-idiff2 = Lsum(idiff2_temp,m);
-idiff3 = fliplr(cumsum(fliplr( ev(dm_inv,num_of_samples+2-2*m,num_of_samples-m,m) )));
-
-term1 = sum(diff1.*(idiff1.^co.alpha));
-term2 = sum(diff2.*(idiff2.^co.alpha));
-term3 = sum(diff3.*(idiff3.^co.alpha));
-    
-H = term1 + term2 + term3;
-H = log(H/num_of_samples^co.alpha) / (1-co.alpha);
-    
-%-------------------
-function [v] = ev(v0,start_idx,end_idx,off)
-%Extraction of the [start_idx:end_idx] coordinates from vector v0, with
-%offset off.
-
-v = v0([start_idx:end_idx]+off);
-
-function [v] = Lsum(v,L)
-%[v_1,...,v_T] -> [v_1+..._v_L,v_2+...+v_{2+L-1},...,v_{K-L+1}...+v_K], K=length(v)
-
-f = filter(ones(1,L),1,v);
-v = f(L:end);
-
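
A worked micro-example of the Lsum helper (sliding-window sums of length L, computed via 'filter'):

    v = [1 2 3 4 5];
    f = filter(ones(1,2),1,v); %[1 3 5 7 9]: running pair sums, warm-up value included
    w = f(2:end);              %[3 5 7 9] = Lsum(v,2): windows {1,2},{2,3},{3,4},{4,5}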

code/H_I_D/base_estimators/HRenyi_spacing_E_initialization.m

-function [co] = HRenyi_spacing_E_initialization(mult)
-%Initialization of the Renyi entropy (H) estimator based on the extension of the 'empiric entropy estimation of order m' method.
-%
-%Note:
-%   1)The estimator is treated as a cost object (co).
-%   2)We make use of the naming convention 'H<name>_initialization', to ease embedding new entropy estimation methods.
-%
-%INPUT:
-%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
-%OUTPUT:
-%   co: cost object (structure).
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-%mandatory fields:
-    co.name = 'Renyi_spacing_E';
-    co.mult = mult;   
-
-%other fields:
-    co.alpha = 0.99; %The Renyi entropy (H_{R,alpha}) converges to the Shannon differential entropy (H) in the limit: H_{R,alpha} -> H as alpha -> 1.
-            

code/H_I_D/base_estimators/HRenyi_spacing_V_estimation.m

-function [H] = HRenyi_spacing_V_estimation(Y,co)
-%Estimates the Renyi entropy (H) of Y (Y(:,t) is the t^th sample) using an extension of Vasicek's spacing method. Cost parameters are provided in the cost object co.
-%
-%We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods. 
-%
-%REFERENCE:
-%   Mark P. Wachowiak, Renata Smolikova, Georgia D. Tourassi, and Adel S. Elmaghraby. Estimation of generalized entropies with sample spacing. Pattern Analysis and Applications 8: 95-101, 2005.
-%
-%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
-%
-%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
-%
-%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
-%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-%
-%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
-%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
-%
-%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
-
-[d,num_of_samples] = size(Y);
-
-%verification:
-    if d~=1
-        error('The samples must be one-dimensional for this estimator.');
-    end
-
-m = floor(sqrt(num_of_samples));%m/num_of_samples->0, m,num_of_samples->infty; m<num_of_samples/2
-Y_sorted = sort(Y);
-Y_sorted = [repmat(Y_sorted(1),1,m),Y_sorted,repmat(Y_sorted(end),1,m)];
-diffs = Y_sorted(2*m+1:num_of_samples+2*m) - Y_sorted(1:num_of_samples);
-H = log(mean((num_of_samples / (2*m) * diffs).^(1-co.alpha))) / (1-co.alpha);
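
A quick sanity check (assumptions: ITE on the path; a matching 'HRenyi_spacing_V_initialization.m' exists per the 'H<name>_initialization' convention): the Renyi entropy of the uniform distribution on [0,1] is 0 for every alpha:

    co = HRenyi_spacing_V_initialization(1);
    H  = HRenyi_spacing_V_estimation(rand(1,10000),co); %should be close to 0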