Commits

Zoltán Szabó committed f93b200

Sharma-Mittal entropy estimation based on (i) k-nearest neighbors (S={k}), and (ii) maximum likelihood estimation (MLE) + analytical value in the exponential family: added; see 'HSharmaM_kNN_k_initialization.m', 'HSharmaM_kNN_k_estimation.m', 'HSharmaM_expF_initialization.m', 'HSharmaM_expF_estimation.m'. Quick test for the Sharma-Mittal entropy estimators: added; see 'quick_test_HSharmaM.m'.
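For reference, the quantity added in this release is the Sharma-Mittal entropy (standard definition; see the Sharma-Mittal and Akturk et al. references cited in the new files):

```latex
H_{\mathrm{SM},\alpha,\beta}(p)
  = \frac{1}{1-\beta}\left[\left(\int p^{\alpha}(x)\,\mathrm{d}x\right)^{\frac{1-\beta}{1-\alpha}} - 1\right],
  \qquad \alpha > 0,\ \alpha \neq 1,\ \beta \neq 1.
```

It converges to the Rényi entropy as beta -> 1, to the Tsallis entropy as beta -> alpha, and to the Shannon entropy as (alpha,beta) -> (1,1).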

  • Parent commits 0d141ab
  • Tags release-0.48


Files changed (17)

+v0.48 (Nov 11, 2013):
+-Sharma-Mittal entropy estimation based on (i) k-nearest neighbors (S={k}), and (ii) maximum likelihood estimation (MLE) + analytical value in the exponential family: added; see 'HSharmaM_kNN_k_initialization.m', 'HSharmaM_kNN_k_estimation.m', 'HSharmaM_expF_initialization.m', 'HSharmaM_expF_estimation.m'.
+-Quick test for the Sharma-Mittal entropy estimators: added; see 'quick_test_HSharmaM.m'.
+-Analytical value for the Sharma-Mittal entropy, MLE, log-normalizer in the exponential family: separately implemented to make the estimation highly modular; see 'analytical_value_HSharmaM.m', 'expF_MLE.m', 'expF_F.m'. 
+
 v0.47 (Nov 1, 2013):
 -Chi-square mutual information estimation based on Pearson chi-square divergence: added; see 'IChiSquare_DChiSquare_initialization', 'IChiSquare_DChiSquare_estimation.m'.
 -Shannon entropy estimation based on an alternative linearly corrected spacing method: added; see 'HShannon_spacing_Vplin2_initialization.m', 'HShannon_spacing_Vplin2_estimation.m'.
 
 ITE can estimate 
 
-- `entropy (H)`: Shannon entropy, Rényi entropy, Tsallis entropy (Havrda and Charvát entropy), complex entropy, Phi-entropy (f-entropy),
+- `entropy (H)`: Shannon entropy, Rényi entropy, Tsallis entropy (Havrda and Charvát entropy), complex entropy, Phi-entropy (f-entropy), Sharma-Mittal entropy,
 - `mutual information (I)`: generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon mutual information (total correlation, multi-information), L2 mutual information, Rényi mutual information, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa, complex mutual information, Cauchy-Schwartz quadratic mutual information, Euclidean distance based quadratic mutual information, distance covariance, distance correlation, approximate correntropy independence measure, chi-square mutual information (Hilbert-Schmidt norm of the normalized cross-covariance operator, squared-loss mutual information, mean square contingency), 
 - `divergence (D)`: Kullback-Leibler divergence (relative entropy, I directed divergence), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance), J-distance (symmetrised Kullback-Leibler divergence, J divergence), Cauchy-Schwartz divergence, Euclidean distance based divergence, energy distance (especially the Cramér-von Mises distance), Jensen-Shannon divergence, Jensen-Rényi divergence, K divergence, L divergence, certain f-divergences (Csiszár-Morimoto divergence, Ali-Silvey distance), non-symmetric Bregman distance (Bregman divergence), Jensen-Tsallis divergence, symmetric Bregman distance, Pearson chi square divergence (chi square distance),
 - `association measures (A)`, including `measures of concordance`: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient, grade correlation coefficient), correntropy, centered correntropy, correntropy coefficient, correntropy induced metric, centered correntropy induced metric, multivariate extension of Blomqvist's beta (medial correlation coefficient), multivariate conditional version of Spearman's rho, lower/upper tail dependence via conditional Spearman's rho,
 
 **Download** the latest release: 
 
-- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE-0.47_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE-0.47_code.tar.bz2), 
-- [documentation (pdf)](https://bitbucket.org/szzoli/ite/downloads/ITE-0.47_documentation.pdf).
+- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE-0.48_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE-0.48_code.tar.bz2), 
+- [documentation (pdf)](https://bitbucket.org/szzoli/ite/downloads/ITE-0.48_documentation.pdf).
 
 

code/estimators/base_estimators/DBhattacharyya_kNN_k_estimation.m

 %  co: divergence estimator object.
 %
 %REFERENCE: 
-%   Barnabas Poczos and Liang Xiong and Dougal Sutherland and Jeff Schneider. Support Distribution Machines. Technical Report, 2012. "http://arxiv.org/abs/1202.0302"
+%   Barnabas Poczos and Liang Xiong and Dougal Sutherland and Jeff Schneider. Support Distribution Machines. Technical Report, 2012. "http://arxiv.org/abs/1202.0302" (estimation of Dtemp2)
 
 %Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
 %

code/estimators/base_estimators/DChiSquare_kNN_k_estimation.m

 %  co: divergence estimator object.
 %
 %REFERENCE: 
-%   Barnabas Poczos, Liang Xiong, Dougal Sutherland, and Jeff Schneider. Support distribution machines. Technical Report, Carnegie Mellon University, 2012. http://arxiv.org/abs/1202.0302. (estimation: Dtemp2 below)
+%   Barnabas Poczos, Liang Xiong, Dougal Sutherland, and Jeff Schneider. Support distribution machines. Technical Report, Carnegie Mellon University, 2012. http://arxiv.org/abs/1202.0302. (estimation of Dtemp2)
 %   Karl Pearson. On the criterion that a given system of deviations from the probable in the case of a correlated system of variables is such that it can be reasonably supposed to have arisen from random sampling. Philosophical Magazine Series, 50:157-172, 1900.
 
 %Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")

code/estimators/base_estimators/DHellinger_kNN_k_estimation.m

 %  co: divergence estimator object.
 %
 %REFERENCE: 
-%   Barnabas Poczos and Liang Xiong and Dougal Sutherland and Jeff Schneider. Support Distribution Machines. Technical Report, 2012. "http://arxiv.org/abs/1202.0302" (estimation)
+%   Barnabas Poczos and Liang Xiong and Dougal Sutherland and Jeff Schneider. Support Distribution Machines. Technical Report, 2012. "http://arxiv.org/abs/1202.0302" (Dtemp2 estimation)
 
 %Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
 %

code/estimators/base_estimators/HSharmaM_expF_estimation.m

+function [H] = HSharmaM_expF_estimation(Y,co)
+%function [H] = HSharmaM_expF_estimation(Y,co)
+%Estimates the Sharma-Mittal entropy (H) of Y using maximum likelihood estimation (MLE) and the analytical formula valid in the chosen exponential family.
+%
+%We use the naming convention 'H<name>_estimation' to ease embedding new entropy estimation methods.
+%
+%INPUT:
+%   Y: Y(:,t) is the t^th sample.
+%  co: entropy estimator object.
+%
+%REFERENCE: 
+%    Frank Nielsen and Richard Nock. A closed-form expression for the Sharma-Mittal entropy of exponential families. Journal of Physics A: Mathematical and Theoretical, 45:032003, 2012. (analytical formulas for the Sharma-Mittal entropy in the exponential family)
+%    Ethem Akturk, Baris Bagci, and Ramazan Sever. Is Sharma-Mittal entropy really a step beyond Tsallis and Renyi entropies? Technical report, 2007. http://arxiv.org/abs/cond-mat/0703277. (Sharma-Mittal entropy)
+%    Bhudev D. Sharma and Dharam P. Mittal. New nonadditive measures of inaccuracy. Journal of Mathematical Sciences, 10:122-133, 1975. (Sharma-Mittal entropy)
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK. The information theoretical quantity of interest can be (and is!) estimated exactly [co.mult=1]; the computational complexity of the estimation is essentially the same as that of the 'up to multiplicative constant' case [co.mult=0].
+
+%MLE:
+  np = expF_MLE(Y,co.distr);
+  
+%I_alpha (analytical formula):  
+  F1 = expF_F(co.distr,expF_np_mult(np,co.alpha)); %F(alpha*theta)
+  F2 = co.alpha * expF_F(co.distr,np); %alpha*F(theta)
+  I_alpha = exp(F1-F2); %assumption: the carrier measure (k) is zero
+    
+H = ( I_alpha^( (1-co.beta)/(1-co.alpha) )  - 1 )/ (1 - co.beta);
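The two code lines above implement the closed form of the Nielsen and Nock reference: for an exponential family with log-normalizer F and zero carrier measure,

```latex
p(x;\theta) = \exp\bigl(\langle t(x),\theta\rangle - F(\theta)\bigr)
\;\Rightarrow\;
I_\alpha = \int p^{\alpha}(x;\theta)\,\mathrm{d}x
         = \exp\bigl(F(\alpha\theta) - \alpha F(\theta)\bigr),
```

after which the Sharma-Mittal entropy is obtained as H = (I_alpha^((1-beta)/(1-alpha)) - 1)/(1-beta).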

code/estimators/base_estimators/HSharmaM_expF_initialization.m

+function [co] = HSharmaM_expF_initialization(mult,post_init)
+%function [co] = HSharmaM_expF_initialization(mult)
+%function [co] = HSharmaM_expF_initialization(mult,post_init)
+%Initialization of the exponential family based Sharma-Mittal entropy estimator (maximum likelihood estimation + analytical formula in the chosen exponential family).
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'H<name>_initialization' to ease embedding new entropy estimation methods.
+%
+%INPUT:
+%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
+%   post_init: {field_name1,field_value1,field_name2,field_value2,...}; cell array containing the names and the values of the cost object fields that are to be used
+%   (instead of their default values). For further details, see 'post_initialization.m'.
+%OUTPUT:
+%   co: cost object (structure).
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields (following the template structure of the estimators to make uniform usage of the estimators possible):
+    co.name = 'SharmaM_expF';
+    co.mult = mult;
+    
+%other fields:
+    co.distr = 'normal'; %exponential family used for estimation; fixed
+    
+    co.alpha = 0.8; %alpha: >0, not 1. 
+    co.beta = 0.6; %beta: not 1.
+    %The Sharma-Mittal entropy (H_{SM,alpha,beta}) converges to the 
+    %  1)Renyi entropy (H_{R,alpha}): H_{SM,alpha,beta} -> H_{R,alpha}, as beta -> 1.
+    %  2)Tsallis entropy (H_{T,alpha}): H_{SM,alpha,beta} -> H_{T,alpha}, as beta -> alpha.
+    %  3)Shannon entropy (H): H_{SM,alpha,beta} -> H, as (alpha,beta) -> (1,1).
+
+%post initialization (put it _before_ initialization of the members in case of a meta estimator):    
+    if nargin==2 %there are given (name,value) cost object fields
+        co = post_initialization(co,post_init);
+    end    

code/estimators/base_estimators/HSharmaM_kNN_k_estimation.m

+function [H] = HSharmaM_kNN_k_estimation(Y,co)
+%function [H] = HSharmaM_kNN_k_estimation(Y,co)
+%Estimates the Sharma-Mittal entropy (H) of Y using the kNN method (S={k}).
+%
+%We use the naming convention 'H<name>_estimation' to ease embedding new entropy estimation methods.
+%
+%INPUT:
+%   Y: Y(:,t) is the t^th sample.
+%  co: entropy estimator object.
+%
+%REFERENCE: 
+%    Nikolai Leonenko, Luc Pronzato, and Vippal Savani. A class of Renyi information estimators for multidimensional densities. Annals of Statistics, 36(5):2153-2182, 2008. (I_alpha estimation)
+%    Joseph E. Yukich. Probability Theory of Classical Euclidean Optimization Problems, Lecture Notes in Mathematics, 1998, vol. 1675. (I_alpha estimation)
+%    Ethem Akturk, Baris Bagci, and Ramazan Sever. Is Sharma-Mittal entropy really a step beyond Tsallis and Renyi entropies? Technical report, 2007. http://arxiv.org/abs/cond-mat/0703277. (Sharma-Mittal entropy)
+%    Bhudev D. Sharma and Dharam P. Mittal. New nonadditive measures of inaccuracy. Journal of Mathematical Sciences, 10:122-133, 1975. (Sharma-Mittal entropy)
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK. The information theoretical quantity of interest can be (and is!) estimated exactly [co.mult=1]; the computational complexity of the estimation is essentially the same as that of the 'up to multiplicative constant' case [co.mult=0].
+
+I_alpha = estimate_Ialpha(Y,co);
+H = ( I_alpha^( (1-co.beta)/(1-co.alpha) )  - 1 )/ (1 - co.beta);
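Schematically, 'estimate_Ialpha' follows the k-nearest-neighbor construction of the Leonenko et al. (2008) reference cited above: with rho_k(i) the distance from the i-th sample to its k-th nearest neighbor among the N samples and V_d the volume of the d-dimensional unit ball,

```latex
\hat{I}_\alpha = \frac{1}{N}\sum_{i=1}^{N}
  \left[(N-1)\,C_k\,V_d\,\rho_k^{d}(i)\right]^{1-\alpha},
\qquad
C_k = \left[\frac{\Gamma(k)}{\Gamma(k+1-\alpha)}\right]^{\frac{1}{1-\alpha}},
```

which is then converted to the Sharma-Mittal entropy exactly as in the exponential-family estimator.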

code/estimators/base_estimators/HSharmaM_kNN_k_initialization.m

+function [co] = HSharmaM_kNN_k_initialization(mult,post_init)
+%function [co] = HSharmaM_kNN_k_initialization(mult)
+%function [co] = HSharmaM_kNN_k_initialization(mult,post_init)
+%Initialization of the kNN (k-nearest neighbor, S={k}) based Sharma-Mittal entropy estimator.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'H<name>_initialization' to ease embedding new entropy estimation methods.
+%
+%INPUT:
+%   mult: is a multiplicative constant relevant (needed) in the estimation; '=1' means yes, '=0' no.
+%   post_init: {field_name1,field_value1,field_name2,field_value2,...}; cell array containing the names and the values of the cost object fields that are to be used
+%   (instead of their default values). For further details, see 'post_initialization.m'.
+%OUTPUT:
+%   co: cost object (structure).
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields (following the template structure of the estimators to make uniform usage of the estimators possible):
+    co.name = 'SharmaM_kNN_k';
+    co.mult = mult;
+    
+%other fields:
+    %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
+        %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.                
+        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k. 												
+        %III: 'knnsearch' (Matlab Statistics Toolbox): parameters: co.k, co.NSmethod ('kdtree' or 'exhaustive').
+        %IV: 'ANN' (approximate nearest neighbor); parameters: co.k, co.epsi.         
+		%I:
+            co.kNNmethod = 'knnFP1';
+            co.k = 3;%k-nearest neighbors
+		%II:
+            %co.kNNmethod = 'knnFP2';
+            %co.k = 3;%k-nearest neighbors
+        %III:
+            %co.kNNmethod = 'knnsearch';
+            %co.k = 3;%k-nearest neighbors
+            %co.NSmethod = 'kdtree';
+        %IV:
+            %co.kNNmethod = 'ANN';
+            %co.k = 3;%k-nearest neighbors
+            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the reported (not squared) distances can exceed the true distances by at most a factor of (1+epsi).
+            
+    co.alpha = 0.8; %alpha: >0, not 1. 
+    co.beta = 0.6; %beta: not 1.
+    %The Sharma-Mittal entropy (H_{SM,alpha,beta}) converges to the 
+    %  1)Renyi entropy (H_{R,alpha}): H_{SM,alpha,beta} -> H_{R,alpha}, as beta -> 1.
+    %  2)Tsallis entropy (H_{T,alpha}): H_{SM,alpha,beta} -> H_{T,alpha}, as beta -> alpha.
+    %  3)Shannon entropy (H): H_{SM,alpha,beta} -> H, as (alpha,beta) -> (1,1).
+    
+%initialize the ann wrapper in Octave, if needed:
+    initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
+
+%post initialization (put it _before_ initialization of the members in case of a meta estimator):    
+    if nargin==2 %there are given (name,value) cost object fields
+        co = post_initialization(co,post_init);
+    end    

code/estimators/base_estimators/KBhattacharyya_kNN_k_estimation.m

 %  co: estimator object of a kernel on distributions.
 %
 %REFERENCE: 
-%   Barnabas Poczos and Liang Xiong and Dougal Sutherland and Jeff Schneider. Support Distribution Machines. Technical Report, 2012. "http://arxiv.org/abs/1202.0302" (k-nearest neighbor based estimation)
+%   Barnabas Poczos and Liang Xiong and Dougal Sutherland and Jeff Schneider. Support Distribution Machines. Technical Report, 2012. "http://arxiv.org/abs/1202.0302" (k-nearest neighbor based estimation of Dtemp2)
 %   Tony Jebara, Risi Kondor, and Andrew Howard. Probability product kernels. Journal of Machine Learning Research, 5:819-844, 2004. (probability product kernels --spec--> Bhattacharyya kernel)
 %   Anil K. Bhattacharyya. On a measure of divergence between two statistical populations defined by their probability distributions. Bulletin of the Calcutta Mathematical Society, 35:99-109, 1943. (Bhattacharyya kernel)
 

code/estimators/base_estimators/KPP_kNN_k_estimation.m

 %  co: estimator object of a kernel on distributions.
 %
 %REFERENCE: 
-%   Barnabas Poczos and Liang Xiong and Dougal Sutherland and Jeff Schneider. Support Distribution Machines. Technical Report, 2012. "http://arxiv.org/abs/1202.0302" (k-nearest neighbor based estimation)
+%   Barnabas Poczos and Liang Xiong and Dougal Sutherland and Jeff Schneider. Support Distribution Machines. Technical Report, 2012. "http://arxiv.org/abs/1202.0302" (k-nearest neighbor based estimation of Dtemp2)
 %   Tony Jebara, Risi Kondor, and Andrew Howard. Probability product kernels. Journal of Machine Learning Research, 5:819-844, 2004. (probability product kernels --spec--> Bhattacharyya kernel)
 
 %Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")

code/estimators/quick_tests/analytical_values/analytical_value_HSharmaM.m

+function [H] = analytical_value_HSharmaM(distr,alpha_H,beta_H,par)
+%function [H] = analytical_value_HSharmaM(distr,alpha_H,beta_H,par)
+%Analytical value (H) of the Sharma-Mittal entropy for the given distribution.
+%
+%INPUT:
+%   distr  : name of the distribution; 'normal'.
+%   alpha_H: parameter of the Sharma-Mittal entropy; alpha: >0, not 1.
+%   beta_H : parameter of the Sharma-Mittal entropy; beta: not 1.
+%   par    : parameters of the distribution (structure);
+%            distr = 'normal': par.cov_mtx = covariance matrix.
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+switch distr
+    case 'normal' %[Frank Nielsen and Richard Nock. A closed-form expression for the Sharma-Mittal entropy of exponential families. Journal of Physics A: Mathematical and Theoretical, 45:032003, 2012.]
+        cov_mtx = par.cov_mtx;
+        d = size(cov_mtx,1); %=size(cov_mtx,2)
+        H = ( ( (2*pi)^(d/2) * sqrt(abs(det(cov_mtx))) )^(1-beta_H) / alpha_H^( d*(1-beta_H) / (2*(1-alpha_H)) )  - 1 ) / (1-beta_H);
+    otherwise
+        error('Distribution=?');                 
+end
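As a numerical cross-check (a minimal Python sketch, not part of the toolbox; the function names `sharma_mittal_expF` and `sharma_mittal_analytical` are my own), the exponential-family route I_alpha = exp(F(alpha*theta) - alpha*F(theta)) can be verified against the closed-form expression above for a normal distribution:

```python
import numpy as np

def log_normalizer_normal(t1, t2):
    # Log-normalizer of the multivariate normal in natural parameters
    # t1 = C^{-1} m, t2 = (1/2) C^{-1}:
    #   F(theta) = 1/4 t1' t2^{-1} t1 - 1/2 log det(t2) + d/2 log(pi)
    d = len(t1)
    return (t1 @ np.linalg.solve(t2, t1)) / 4 \
        - np.linalg.slogdet(t2)[1] / 2 + d * np.log(np.pi) / 2

def sharma_mittal_expF(m, C, alpha, beta):
    # H = (I_alpha^((1-beta)/(1-alpha)) - 1)/(1-beta), with
    # I_alpha = exp(F(alpha*theta) - alpha*F(theta)) (zero carrier measure).
    t1 = np.linalg.solve(C, m)
    t2 = np.linalg.inv(C) / 2
    I_alpha = np.exp(log_normalizer_normal(alpha * t1, alpha * t2)
                     - alpha * log_normalizer_normal(t1, t2))
    return (I_alpha ** ((1 - beta) / (1 - alpha)) - 1) / (1 - beta)

def sharma_mittal_analytical(C, alpha, beta):
    # Closed form for N(m, C); independent of the mean m.
    d = C.shape[0]
    I = ((2 * np.pi) ** (d / 2) * np.sqrt(np.linalg.det(C))) ** (1 - beta) \
        / alpha ** (d * (1 - beta) / (2 * (1 - alpha)))
    return (I - 1) / (1 - beta)

rng = np.random.default_rng(0)
d, alpha, beta = 3, 0.8, 0.6
A = rng.random((d, d))
C = A @ A.T + np.eye(d)  # well-conditioned covariance matrix
m = rng.random(d)
assert np.isclose(sharma_mittal_expF(m, C, alpha, beta),
                  sharma_mittal_analytical(C, alpha, beta))
```

The mean drops out of the difference F(alpha*theta) - alpha*F(theta), which is why the analytical value depends only on the covariance matrix.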

code/estimators/quick_tests/quick_test_HSharmaM.m

+%function [] = quick_test_HSharmaM()
+%Quick test for Sharma-Mittal entropy estimators: analytical expression vs estimated value as a function of the sample number. In the test, normal variables are considered.
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%clear start:
+    clear all; close all;
+
+%parameters:
+    distr = 'normal'; %fixed
+    d = 1; %dimension of the distribution
+    alpha_H = 0.8; %parameter of the Sharma-Mittal entropy; alpha: >0, not 1.
+    beta_H = 0.6; %parameter of the Sharma-Mittal entropy; beta: not 1.
+    
+    num_of_samples_v = [1000:1000:30*1000]; %sample numbers used for estimation
+    %estimator (of Sharma-Mittal entropy), base:
+        cost_name = 'SharmaM_kNN_k'; %d>=1
+        %cost_name = 'SharmaM_expF';   %d>=1
+    
+%initialization:
+    num_of_samples_max = num_of_samples_v(end);
+    L = length(num_of_samples_v);
+    %initialize cost object, set the alpha and beta parameters:
+        co = H_initialization(cost_name,1,{'alpha',alpha_H,'beta',beta_H}); %{'alpha',alpha_H,'beta',beta_H}: set the 'alpha' and 'beta' fields
+    H_hat_v = zeros(L,1); %vector of estimated entropies
+
+%distr, d -> samples (Y), analytical formula for the entropy (H):
+    switch distr
+        case 'normal'
+            %expectation:
+                e = rand(d,1);
+            %random linear transformation applied to N(0,I):
+                A = rand(d); 
+                %A = eye(d); %do not transform the data
+            %covariance matrix:
+                cov_mtx = A * A.';
+            %generate samples:
+                Y = A * randn(d,num_of_samples_max) + repmat(e,1,num_of_samples_max); %AxN(0,I)+e
+            %analytical value of the Sharma-Mittal entropy:
+                par.cov_mtx = cov_mtx;
+                H = analytical_value_HSharmaM(distr,alpha_H,beta_H,par);
+        otherwise
+            error('Distribution=?');                 
+    end
+    
+%estimation:
+    Tk = 0;%index of the sample number examined   
+    for num_of_samples = num_of_samples_v
+        Tk = Tk + 1;
+        H_hat_v(Tk) = H_estimation(Y(:,1:num_of_samples),co);
+        disp(strcat('Tk=',num2str(Tk),'/',num2str(L)));
+    end
+    
+%plot:
+    plot(num_of_samples_v,H_hat_v,'r',num_of_samples_v,H*ones(L,1),'g');
+    legend({'estimation','analytical value'});
+    xlabel('Number of samples');
+    ylabel('Sharma-Mittal entropy');
+ 

code/estimators/quick_tests/quick_test_IRenyi.m

     for num_of_samples = num_of_samples_v
         Tk = Tk + 1;
         I_hat_v(Tk) = I_estimation(Y(:,1:num_of_samples),ones(d,1),co);
+        disp(strcat('Tk=',num2str(Tk),'/',num2str(L)));
     end
     
 %plot:

code/estimators/utilities/expF_F.m

+function [F] = expF_F(distr,np) 
+%function [F] = expF_F(distr,np) 
+%Computes the log-normalizer (F) at a given natural parameter value (np) for the input exponential family.
+%
+%INPUT:
+%   distr: 'normal'.
+%   np   : natural parameters.
+%          distr = 'normal': np.t1 = C^{-1}*m, np.t2 = 1/2*C^{-1}, where m is the mean, C is the covariance matrix.
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+switch distr
+    case 'normal'
+        d = length(np.t1);
+        F = trace(inv(np.t2) * np.t1 * np.t1.') /4 - log(det(np.t2)) / 2 + d * log(pi) / 2;
+    otherwise
+       error('Distribution=?');   
+end

code/estimators/utilities/expF_MLE.m

+function [np] = expF_MLE(Y,distr)
+%function [np] = expF_MLE(Y,distr)
+%Maximum likelihood estimation with the given exponential family on data Y.
+%
+%INPUT:
+%   distr: 'normal'.
+%   Y    : data, Y(:,t) is the t^{th} sample.
+%OUTPUT:
+%   np: estimated natural parameters.
+%       distr = 'normal': np.t1 = C^{-1}*m, np.t2 = 1/2*C^{-1}, where m is the mean, C is the covariance matrix.
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+switch distr
+    case 'normal'
+        m = mean(Y,2);
+        C = cov(Y.');
+        invC = inv(C);
+        np.t1 = invC * m;
+        np.t2 = invC / 2;
+    otherwise
+       error('Distribution=?');                 
+end

code/estimators/utilities/expF_np_mult.m

+function [np] = expF_np_mult(np,a)
+%function [np] = expF_np_mult(np,a)
+%Multiplies each natural parameter in structure 'np' by the scalar 'a'.
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+F = fieldnames(np);
+for nF = 1 : length(F)
+    aF = F{nF};
+    np.(aF)  = np.(aF) * a;
+end