Zoltan Szabo committed ca6860b

Tsallis entropy is now available. A typo ('/' mistyped as '*') has been corrected in 'HRenyi_kNN_k_estimation.m'; see the volume term (V), now computed in 'estimate_Ialpha.m'.


Files changed (30)

+v0.13 (Oct 27, 2012):
+-Tsallis entropy is now available in ITE; it can be estimated via k-nearest neighbors, see 'HTsallis_kNN_k_initialization.m', 'HTsallis_kNN_k_estimation.m'.
+-A typo ('/' mistyped as '*') corrected in 'HRenyi_kNN_k_estimation.m'; see the volume term (V) in 'estimate_Ialpha.m'.
+
 v0.12 (Oct 27, 2012):
 -Schweizer-Wolff's sigma and kappa: added; see 'ISW1_initialization.m', 'ISW1_estimation.m', 'ISWinf_initialization.m', 'ISWinf_estimation.m'.
 -Hoeffding's Phi computation: scaled-up (C++); see 'Hoeffding_term1.cpp'.
 - multi-platform (tested extensively on Windows and Linux),
 - free and open source (released under the GNU GPLv3(>=) license).
 
-ITE can estimate Shannon-, Rényi entropy; generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon-, L2-, Rényi-, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa; complex variants of entropy and mutual information; L2-, Rényi-, Tsallis divergence, maximum mean discrepancy, and J-distance.
+ITE can estimate Shannon-, Rényi-, Tsallis entropy; generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon-, L2-, Rényi-, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa; complex variants of entropy and mutual information; L2-, Rényi-, Tsallis divergence, maximum mean discrepancy, and J-distance.
 
 ITE offers solution methods for 
 
 - Independent Subspace Analysis (ISA) and 
 - its extensions to different linear-, controlled-, post nonlinear-, complex valued-, partially observed systems, as well as to systems
 with nonparametric source dynamics. 
+
+Note: become a Follower to stay up-to-date.

code/H_I_D/base_estimators/DL2_kNN_k_estimation.m

-function [L] = DL2_kNN_k_estimation(X,Y,co)
-%Estimates the L2 divergence (L) of X and Y (X(:,t), Y(:,t) is the t^th sample)
+function [D] = DL2_kNN_k_estimation(X,Y,co)
+%Estimates the L2 divergence (D) of X and Y (X(:,t), Y(:,t) is the t^th sample)
 %using the kNN method (S={k}). The number of samples in X [=size(X,2)] and Y [=size(Y,2)] can be different. Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
 term2 = 2 * (co.k-1) / (num_of_samplesY*c) ./ dist_k_YX.^d;
 term3 = (num_of_samplesX - 1) * c  * (co.k-2) * (co.k-1) / (co.k * (num_of_samplesY*c)^2) * (dist_k_XX.^d ./ dist_k_YX.^(2*d));
 L2 = mean(term1-term2+term3);
-%L = sqrt(L2);%theoretically
-L = sqrt(abs(L2));%due to the finite number of samples L2 can be negative. In this case: sqrt(L2) is complex; to avoid such values we take abs().
+%D = sqrt(L2);%theoretically
+D = sqrt(abs(L2));%due to the finite number of samples L2 can be negative. In this case: sqrt(L2) is complex; to avoid such values we take abs().
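
A minimal usage sketch of this estimator (illustration only; it assumes the ITE code directories are on the Matlab/Octave path, and the Gaussian test data is made up for the example):

%illustration: two 3-dimensional Gaussian samples; the sample sizes may differ
X = randn(3,2000);                %samples from N(0,I)
Y = 2*randn(3,3000) + 1;          %samples from a shifted, scaled Gaussian
co = DL2_kNN_k_initialization(1); %cost object with the default kNN settings
D = DL2_kNN_k_estimation(X,Y,co)  %L2 divergence estimate; abs() above keeps it real for finite samples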

code/H_I_D/base_estimators/DL2_kNN_k_initialization.m

 function [co] = DL2_kNN_k_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={k}) based L2 divergence (L) estimator.
+%Initialization of the kNN (k-nearest neighbor, S={k}) based L2 divergence estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
 
 %initialize the ann wrapper in Octave, if needed:
     initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
-            
+            

code/H_I_D/base_estimators/DRenyi_kNN_k_estimation.m

-function [R_alpha] = DRenyi_kNN_k_estimation(X,Y,co)
-%Estimates the Rényi divergence (R_alpha) of X and Y (X(:,t), Y(:,t) is the t^th sample)
+function [D] = DRenyi_kNN_k_estimation(X,Y,co)
+%Estimates the Rényi divergence (D) of X and Y (X(:,t), Y(:,t) is the t^th sample)
 %using the kNN method (S={k}). The number of samples in X [=size(X,2)] and Y [=size(Y,2)] can be different. Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
 end
 
 D_alpha = estimate_Dalpha(X,Y,co);
-R_alpha = log(D_alpha) / (co.alpha-1);
+D = log(D_alpha) / (co.alpha-1);

code/H_I_D/base_estimators/DRenyi_kNN_k_initialization.m

 function [co] = DRenyi_kNN_k_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={k}) based Rényi divergence (R_{alpha}) estimator.
+%Initialization of the kNN (k-nearest neighbor, S={k}) based Rényi divergence estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
             %co.k = 3;%k-nearest neighbors
             %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the true (not squared) distances can not exceed the real distance more than a factor of (1+epsi).
             
-    co.alpha = 0.99; %The Rényi divergence equals to the Kullback-Leibler divergence in limit, i.e., R_{alpha} -> KL, provided that alpha ->1.
+    co.alpha = 0.99; %The Rényi divergence equals the Kullback-Leibler divergence in the limit, i.e., D_{R,alpha} -> KL, provided that alpha -> 1.
 
 %initialize the ann wrapper in Octave, if needed:
     initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);

code/H_I_D/base_estimators/DTsallis_kNN_k_estimation.m

-function [T_alpha] = DTsallis_kNN_k_estimation(X,Y,co)
-%Estimates the Tsallis divergence (T_alpha) of X and Y (X(:,t), Y(:,t) is the t^th sample)
+function [D] = DTsallis_kNN_k_estimation(X,Y,co)
+%Estimates the Tsallis divergence (D) of X and Y (X(:,t), Y(:,t) is the t^th sample)
 %using the kNN method (S={k}). The number of samples in X [=size(X,2)] and Y [=size(Y,2)] can be different. Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
 end
 
 D_alpha = estimate_Dalpha(X,Y,co);
-T_alpha = (D_alpha-1)/(co.alpha-1);
+D = (D_alpha-1)/(co.alpha-1);
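
As the comments in the corresponding initialization files note, both divergences tend to the Kullback-Leibler divergence as alpha -> 1. A minimal sketch comparing the two (illustration only; made-up Gaussian data, ITE assumed on the path):

X = randn(2,5000); Y = randn(2,5000) + 0.5;  %illustrative samples from two Gaussians
co_R = DRenyi_kNN_k_initialization(1);       %co_R.alpha = 0.99 by default
co_T = DTsallis_kNN_k_initialization(1);     %co_T.alpha = 0.99 by default
D_R = DRenyi_kNN_k_estimation(X,Y,co_R);     %log(D_alpha) / (alpha-1)
D_T = DTsallis_kNN_k_estimation(X,Y,co_T);   %(D_alpha-1) / (alpha-1)
disp([D_R, D_T])                             %with alpha close to 1 the two estimates are close to each other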

code/H_I_D/base_estimators/DTsallis_kNN_k_initialization.m

 function [co] = DTsallis_kNN_k_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={k}) based Tsallis divergence (T_{alpha}) estimator.
+%Initialization of the kNN (k-nearest neighbor, S={k}) based Tsallis divergence estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
             %co.k = 3;%k-nearest neighbors
             %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the true (not squared) distances can not exceed the real distance more than a factor of (1+epsi).
 				
-    co.alpha = 0.99; %The Tsallis divergence equals to the Kullback-Leibler divergence in limit, i.e., T_{alpha} -> KL, provided that alpha ->1.
+    co.alpha = 0.99; %The Tsallis divergence equals the Kullback-Leibler divergence in the limit, i.e., D_{T,alpha} -> KL, provided that alpha -> 1.
 
 %initialize the ann wrapper in Octave, if needed:
     initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
-    
+    

code/H_I_D/base_estimators/HRenyi_GSF_estimation.m

-function [H_alpha] = HRenyi_GSF_estimation(Y,co)
-%Estimates the Rényi entropy (H_alpha) of Y (Y(:,t) is the t^th sample)
+function [H] = HRenyi_GSF_estimation(Y,co)
+%Estimates the Rényi entropy (H) of Y (Y(:,t) is the t^th sample)
 %using the GSF method. Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
 %W->L (using MatlabBGL); minimal spanning forest, and its weight (L): 
     L = compute_MST(W,co.GSFmethod);
 
-%H_alpha estimation:
-    H_alpha = log(L/num_of_samples.^co.alpha) / (1-co.alpha);
+%estimation:
+    H = log(L/num_of_samples.^co.alpha) / (1-co.alpha);

code/H_I_D/base_estimators/HRenyi_GSF_initialization.m

 function [co] = HRenyi_GSF_initialization(mult)
-%Initialization of the GSF (geodesic spanning forest) based Rényi entropy (H_{alpha}) estimator.
+%Initialization of the GSF (geodesic spanning forest) based Rényi entropy estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
     %Method to compute the geodesic spanning forest:
         co.GSFmethod = 'MatlabBGL_Kruskal';
 
-    co.alpha = 0.99; %The Rényi entropy equals to the Shannon differential entropy, in limit, i.e., Renyi=H_{alpha} -> Shannon=H, provided that alpha ->1.
+    co.alpha = 0.99; %The Rényi entropy equals the Shannon differential entropy in the limit, i.e., Renyi=H_{R,alpha} -> Shannon=H, provided that alpha -> 1.
 
 %initialize the ann wrapper in Octave, if needed:
     initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);

code/H_I_D/base_estimators/HRenyi_MST_estimation.m

-function [H_alpha] = HRenyi_MST_estimation(Y,co)
-%Estimates the Rényi entropy (H_alpha) of Y (Y(:,t) is the t^th sample)
+function [H] = HRenyi_MST_estimation(Y,co)
+%Estimates the Rényi entropy (H) of Y (Y(:,t) is the t^th sample)
 %using the minimum spanning tree (MST). Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
     
 %co.mult:OK.
 
-%H_alpha estimation up to an additive constant:
-    H_alpha = log(L/num_of_samples^(co.alpha)) / (1-co.alpha);
+%estimation up to an additive constant:
+    H = log(L/num_of_samples^(co.alpha)) / (1-co.alpha);
 

code/H_I_D/base_estimators/HRenyi_MST_initialization.m

 function [co] = HRenyi_MST_initialization(mult)
-%Initialization of the minimum spanning tree (MST) based Rényi entropy (H_{alpha}) estimator.
+%Initialization of the minimum spanning tree (MST) based Rényi entropy estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
     co.mult = mult;
     
 %other fields:
-    co.alpha = 0.99; %The Rényi entropy equals to the Shannon differential entropy, in limit, i.e., Renyi=H_{alpha} -> Shannon=H, provided that alpha ->1.
+    co.alpha = 0.99; %The Rényi entropy equals the Shannon differential entropy in the limit, i.e., Renyi=H_{R,alpha} -> Shannon=H, provided that alpha -> 1.
     %Possibilites for the MST (minimum spanning tree) method:
         co.MSTmethod = 'MatlabBGL_Prim';
         %co.MSTmethod = 'MatlabBGL_Kruskal';

code/H_I_D/base_estimators/HRenyi_kNN_1tok_estimation.m

-function [H_alpha] = HRenyi_kNN_1tok_estimation(Y,co)
-%Estimates the Rényi entropy (H_alpha) of Y (Y(:,t) is the t^th sample)
+function [H] = HRenyi_kNN_1tok_estimation(Y,co)
+%Estimates the Rényi entropy (H) of Y (Y(:,t) is the t^th sample)
 %using the kNN method (S={1,...,k}). Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
 
 %co.mult:OK.
 
-%H_alpha estimation up to an additive constant:
+%estimation up to an additive constant:
     L = sum(sum(squared_distances.^(gam/2),1));
-    H_alpha = log( L / num_of_samples^co.alpha ) / (1-co.alpha);
+    H = log( L / num_of_samples^co.alpha ) / (1-co.alpha);
 
 
 

code/H_I_D/base_estimators/HRenyi_kNN_1tok_initialization.m

 function [co] = HRenyi_kNN_1tok_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={1,...,k}) based Rényi entropy (H_{alpha}) estimator.
+%Initialization of the kNN (k-nearest neighbor, S={1,...,k}) based Rényi entropy estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
             %co.k = 3;%k-nearest neighbors
             %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the true (not squared) distances can not exceed the real distance more than a factor of (1+epsi).
 
-    co.alpha = 0.99; %The Rényi entropy equals to the Shannon differential entropy, in limit, i.e., Renyi=H_{alpha} -> Shannon=H, provided that alpha ->1.
+    co.alpha = 0.99; %The Rényi entropy equals the Shannon differential entropy in the limit, i.e., Renyi=H_{R,alpha} -> Shannon=H, provided that alpha -> 1.
 
 %initialize the ann wrapper in Octave, if needed:
     initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
-    
+    

code/H_I_D/base_estimators/HRenyi_kNN_S_estimation.m

-function [H_alpha] = HRenyi_kNN_S_estimation(Y,co)
-%Estimates the Rényi entropy (H_alpha) of Y (Y(:,t) is the t^th sample) using the generalized k-nearest neighbor (S\subseteq {1,...,k}) method. Cost parameters are provided in the cost object co.
+function [H] = HRenyi_kNN_S_estimation(Y,co)
+%Estimates the Rényi entropy (H) of Y (Y(:,t) is the t^th sample) using the generalized k-nearest neighbor (S\subseteq {1,...,k}) method. Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
 %
 
 %co.mult:OK.
 
-%H_alpha estimation up to an additive constant:
+%estimation up to an additive constant:
     L = sum(sum(squared_distances(co.k,:).^(p/2),1)); %'p/2' <= squared; S=co.k
-    H_alpha = log( L / num_of_samples^co.alpha) / (1-co.alpha); 
+    H = log( L / num_of_samples^co.alpha) / (1-co.alpha); 

code/H_I_D/base_estimators/HRenyi_kNN_S_initialization.m

 function [co] = HRenyi_kNN_S_initialization(mult)
-%Initialization of the generalized k-nearest neighbor (S\subseteq {1,...,k}) based Rényi entropy (H_{alpha}) estimator.
+%Initialization of the generalized k-nearest neighbor (S\subseteq {1,...,k}) based Rényi entropy estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
             %co.k = [1,2,4];%=S: nearest neighbor set
             %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the true (not squared) distances can not exceed the real distance more than a factor of (1+epsi).
 				
-    co.alpha = 0.99; %The Rényi entropy equals to the Shannon differential entropy, in limit, i.e., Renyi=H_{alpha} -> Shannon=H, provided that alpha ->1.
+    co.alpha = 0.99; %The Rényi entropy equals the Shannon differential entropy in the limit, i.e., Renyi=H_{R,alpha} -> Shannon=H, provided that alpha -> 1.
     
 %initialize the ann wrapper in Octave, if needed:
     initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);

code/H_I_D/base_estimators/HRenyi_kNN_k_estimation.m

-function [H_alpha] = HRenyi_kNN_k_estimation(Y,co)
-%Estimates the Rényi entropy (H_alpha) of Y (Y(:,t) is the t^th sample)
+function [H] = HRenyi_kNN_k_estimation(Y,co)
+%Estimates the Rényi entropy (H) of Y (Y(:,t) is the t^th sample)
 %using the kNN method (S={k}). Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
 %
 %You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
 
-[d,num_of_samples] = size(Y);
-squared_distances = kNN_squared_distances(Y,Y,co,1);
+%co.mult:OK.
 
-%co.mult:OK.
-	
-%H_alpha estimation:
-	V = pi^(d/2)*gamma(d/2+1);
-	C = ( gamma(co.k)/gamma(co.k+1-co.alpha) )^(1/(1-co.alpha));
-    s = sum( squared_distances(co.k,:).^(d*(1-co.alpha)/2) ); %'/2' <= squared distances
-	H_alpha = log( (num_of_samples-1) / num_of_samples * V^(1-co.alpha) * C^(1-co.alpha) * s / (num_of_samples-1)^(co.alpha) ) / (1-co.alpha);
-
+I_alpha = estimate_Ialpha(Y,co);
+H = log(I_alpha) / (1-co.alpha);
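
The estimator now delegates the computation of I_alpha to the shared helper 'estimate_Ialpha.m' (shown below), which also carries the corrected volume term mentioned in the changelog: V = pi^(d/2)/gamma(d/2+1) is the volume of the d-dimensional unit ball, so division, not multiplication, is the right operation. A quick sanity check of that term (illustration only):

d = 1:3;
V = pi.^(d/2)./gamma(d/2+1)  %returns [2, pi, 4*pi/3]: length, area and volume of the unit ball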

code/H_I_D/base_estimators/HRenyi_kNN_k_initialization.m

 function [co] = HRenyi_kNN_k_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={k}) based Rényi entropy (H_{alpha}) estimator.
+%Initialization of the kNN (k-nearest neighbor, S={k}) based Rényi entropy estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
             %co.k = 3;%k-nearest neighbors
             %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the true (not squared) distances can not exceed the real distance more than a factor of (1+epsi).
             
-    co.alpha = 0.95; %The Rényi entropy equals to the Shannon differential entropy, in limit, i.e., Renyi=H_{alpha} -> Shannon=H, provided that alpha ->1.
+    co.alpha = 0.95; %The Rényi entropy equals the Shannon differential entropy in the limit, i.e., Renyi=H_{R,alpha} -> Shannon=H, provided that alpha -> 1.
     
 %initialize the ann wrapper in Octave, if needed:
     initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);

code/H_I_D/base_estimators/HRenyi_weightedkNN_estimation.m

-function [H_alpha] = HRenyi_weightedkNN_estimation(Y,co)
-%Estimates the Rényi entropy (H_alpha) of Y (Y(:,t) is the t^th sample)
+function [H] = HRenyi_weightedkNN_estimation(Y,co)
+%Estimates the Rényi entropy (H) of Y (Y(:,t) is the t^th sample)
 %using the weighted k-nearest neighbor method. Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
     
     %note gamma(kvec)./gamma(kvec+1-alpha) = beta(kvec,1-alpha)./gamma(1-alpha)
     etahatvec = (beta(kvec,1-co.alpha)./gamma(1-co.alpha)).*mean((((M).*cdunitball.*(kdist.^d))).^(1-co.alpha));
-    H_alpha = (sum(wo .* etahatvec));
-    H_alpha = log(H_alpha) / (1-co.alpha);
+    H = (sum(wo .* etahatvec));
+    H = log(H) / (1-co.alpha);

code/H_I_D/base_estimators/HRenyi_weightedkNN_initialization.m

 function [co] = HRenyi_weightedkNN_initialization(mult)
-%Initialization of the weighted kNN based Rényi entropy (H_{alpha}) estimator.
+%Initialization of the weighted kNN based Rényi entropy estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
     co.mult = mult;
     
 %other fields:
-    co.alpha = 0.95; %The Rényi entropy equals to the Shannon differential entropy, in limit, i.e., Renyi=H_{alpha} -> Shannon=H, provided that alpha ->1.
+    co.alpha = 0.95; %The Rényi entropy equals the Shannon differential entropy in the limit, i.e., Renyi=H_{R,alpha} -> Shannon=H, provided that alpha -> 1.
     %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
         %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.
         %II: 'knnFP2': fast pairwise distance computation; parameter: co.k. 												 		
 
 %initialize the ann wrapper in Octave, if needed:
     initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
-            
+            

code/H_I_D/base_estimators/HShannon_kNN_k_initialization.m

 function [co] = HShannon_kNN_k_initialization(mult)
-%Initialization of the kNN (k^th nearest neighbor, i.e, S={k}) based Shannon differential entropy (H) estimator.
+%Initialization of the kNN (k^th nearest neighbor, i.e., S={k}) based Shannon differential entropy estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
 
 %initialize the ann wrapper in Octave, if needed:
     initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
-            
+            

code/H_I_D/base_estimators/HTsallis_kNN_k_estimation.m

+function [H] = HTsallis_kNN_k_estimation(Y,co)
+%Estimates the Tsallis entropy (H) of Y (Y(:,t) is the t^th sample)
+%using the kNN method (S={k}). Cost parameters are provided in the cost object co.
+%
+%We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
+%
+%REFERENCE: 
+%   Nikolai Leonenko, Luc Pronzato, and Vippal Savani. A class of Rényi information estimators for multidimensional densities. Annals of Statistics, 36(5):2153–2182, 2008.
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK.
+
+I_alpha = estimate_Ialpha(Y,co);
+H = (1-I_alpha) / (co.alpha-1);

code/H_I_D/base_estimators/HTsallis_kNN_k_initialization.m

+function [co] = HTsallis_kNN_k_initialization(mult)
+%Initialization of the kNN (k-nearest neighbor, S={k}) based Tsallis entropy estimator.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We make use of the naming convention 'H<name>_initialization', to ease embedding new entropy estimation methods.
+%
+%INPUT:
+%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
+%OUTPUT:
+%   co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields:
+    co.name = 'Tsallis_kNN_k';
+    co.mult = mult;
+    
+%other fields:
+    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
+        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.                
+        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k. 												
+        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
+        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi.         
+		%I:
+            co.kNNmethod = 'knnFP1';
+            co.k = 3;%k-nearest neighbors
+		%II:
+            %co.kNNmethod = 'knnFP2';
+            %co.k = 3;%k-nearest neighbors
+        %III:
+            %co.kNNmethod = 'knnsearch';
+            %co.k = 3;%k-nearest neighbors
+            %co.NSmethod = 'kdtree';
+        %IV:
+            %co.kNNmethod = 'ANN';
+            %co.k = 3;%k-nearest neighbors
+            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the true (not squared) distances can not exceed the real distance more than a factor of (1+epsi).
+            
+    co.alpha = 0.95; 
+    
+%initialize the ann wrapper in Octave, if needed:
+    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
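
A minimal usage sketch of the new Tsallis entropy estimator (illustration only; made-up Gaussian data, ITE assumed on the path; the Shannon estimator 'HShannon_kNN_k_estimation.m' belongs to the toolbox but is not part of this changeset). The Rényi and Tsallis estimators build on the same I_alpha (see 'estimate_Ialpha.m' below), and both tend to the Shannon entropy as alpha -> 1:

Y = randn(2,5000);                        %illustrative sample from a 2-dimensional standard normal
co_T = HTsallis_kNN_k_initialization(1);  %co_T.alpha = 0.95 by default
co_R = HRenyi_kNN_k_initialization(1);    %co_R.alpha = 0.95 by default
co_S = HShannon_kNN_k_initialization(1);
H_T = HTsallis_kNN_k_estimation(Y,co_T);  %(1-I_alpha) / (alpha-1)
H_R = HRenyi_kNN_k_estimation(Y,co_R);    %log(I_alpha) / (1-alpha)
H_S = HShannon_kNN_k_estimation(Y,co_S);  %Shannon differential entropy
disp([H_T, H_R, H_S])                     %with alpha close to 1 all three values are close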

code/H_I_D/meta_estimators/Hcomplex_estimation.m

 function [H] = Hcomplex_estimation(Y,co)
-%Estimates complex entropy from a real-valued (vector) entropy estimator; this a "meta" method, the applied estimator can be arbitrary.
+%Estimates complex entropy (H) from a real-valued (vector) entropy estimator; this is a "meta" method, i.e., the applied estimator can be arbitrary.
 %
 %We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
 %

code/H_I_D/meta_estimators/Hensemble_estimation.m

 function [H] = Hensemble_estimation(Y,co)
-%Estimates entropy from the average of entropy estimations on groups of samples; this a "meta" method, the applied entropy estimator can be arbitrary.
+%Estimates entropy (H) from the average of entropy estimations on groups of samples; this is a "meta" method, i.e., the applied entropy estimator can be arbitrary.
 %
 %We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
 %
 for k = 1 : num_of_groups
     H = H + H_estimation(Y(:,(k-1)*g+1:k*g),co.member_co);
 end
-H = H / num_of_groups;
+H = H / num_of_groups;

code/H_I_D/meta_estimators/IRenyi_HRenyi_estimation.m

-function [I]  = IRenyi_HRenyi_estimation(Y,ds,co)
+function [I_alpha]  = IRenyi_HRenyi_estimation(Y,ds,co)
 %Estimates Rényi mutual information using the formula: "I_{alpha}(Y) = -H_{alpha}(Z)", where Z =[F_1(Y_1);...;F_d(Y_d)] is the copula transformation of Y; F_i is the cdf of Y_i.
 %This is a "meta" method, i.e., the H_{alpha} estimator can be arbitrary.
 %
 
 if one_dimensional_problem(ds)
     Z = copula_transformation(Y);
-    I = -H_estimation(Z,co.member_co);
+    I_alpha = -H_estimation(Z,co.member_co);
 else
     disp('Error: the subspaces must be one-dimensional for this estimator.');
 end
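
A minimal sketch of how this meta estimator might be called (illustration only; the initializer name 'IRenyi_HRenyi_initialization' is assumed from the 'I<name>_initialization' naming convention and does not appear in this changeset):

ds = [1;1;1];                               %three one-dimensional subspaces, as required by this estimator
Y = randn(3,5000);                          %illustrative sample
co = IRenyi_HRenyi_initialization(1);       %assumed initializer; its member is a Rényi entropy estimator
I_alpha = IRenyi_HRenyi_estimation(Y,ds,co) %I_{alpha}(Y) = -H_{alpha}(Z), Z: copula transformation of Y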

code/H_I_D/meta_estimators/IShannon_HShannon_initialization.m

     
 %other fields:
     co.member_name = 'Shannon_kNN_k'; %you can change it to any Shannon entropy estimator capable of 'multiplicative 
-	%factor correct' H estimation (mult=1) below. Note: Rényi entropy (H_{alpha}) also gives in limit (alpha->1) the Shannon entropy (H).
+	%factor correct' H estimation (mult=1) below. Note: the Rényi entropy (H_{R,alpha}) also gives the Shannon entropy (H) in the limit (alpha->1).
     co.member_co = H_initialization(co.member_name,1);%'1': since we use the relation '(*)' (multiplicative factors in entropy estimations are NOT allowed).
    
     

code/H_I_D/meta_estimators/Icomplex_estimation.m

 function [I]  = Icomplex_estimation(Y,ds,co)
-%Estimates mutual information using a real-valued (vector) mutual information estimator. This is a "meta" method, i.e, the real mutual information estimator can be arbitrary.
+%Estimates mutual information (I) using a real-valued (vector) mutual information estimator. This is a "meta" method, i.e., the real mutual information estimator can be arbitrary.
 %
 %We make use of the naming convention 'I<name>_estimation', to ease embedding new mutual information estimation methods.
 %
 %
 %You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
 
-I = I_estimation(C2R_vector(Y),2*ds,co.member_co);
+I = I_estimation(C2R_vector(Y),2*ds,co.member_co);

code/H_I_D/utilities/estimate_Ialpha.m

+function [I_alpha] = estimate_Ialpha(Y,co)
+%Estimates I_alpha = \int p^{\alpha}(y)dy; the Rényi and the Tsallis entropies are simple functions of this quantity. Here, alpha:=co.alpha.
+%
+%INPUT:
+%   Y: Y(:,t) is the t^th sample from the distribution having density p.
+%  co: cost object (structure).
+%
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+[d,num_of_samples] = size(Y);
+squared_distances = kNN_squared_distances(Y,Y,co,1);
+
+V = pi^(d/2)/gamma(d/2+1);%volume of the d-dimensional unit ball
+C = ( gamma(co.k)/gamma(co.k+1-co.alpha) )^(1/(1-co.alpha));
+s = sum( squared_distances(co.k,:).^(d*(1-co.alpha)/2) ); %'/2' <= squared distances
+I_alpha = (num_of_samples-1) / num_of_samples * V^(1-co.alpha) * C^(1-co.alpha) * s / (num_of_samples-1)^(co.alpha);
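
As a rough consistency check of this helper (illustration only; the closed form used below, I_alpha = alpha^(-d/2) * (2*pi)^(d*(1-alpha)/2) for the d-dimensional standard normal, is a standard Gaussian integral and is not part of the toolbox):

d = 2; Y = randn(d,10000);                            %sample from the d-dimensional standard normal
co = HRenyi_kNN_k_initialization(1);                  %supplies co.alpha (=0.95) and the kNN fields used above
I_hat = estimate_Ialpha(Y,co);
I_true = co.alpha^(-d/2) * (2*pi)^(d*(1-co.alpha)/2); %closed form for N(0,I)
disp([I_hat, I_true])                                 %the two values should roughly agree for large samples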

doc/ITE_documentation.pdf

Binary file modified.
