Zoltán Szabó avatar Zoltán Szabó committed 94449cf

Exponentiated Jensen-Shannon kernel, Jensen-Tsallis kernel: added.



-v0.42 (September 7, 2013):
+v0.43 (Sep 20, 2013):
+-Exponentiated Jensen-Shannon kernel estimation: added; see 'KEJS_DJS_initialization.m', 'KEJS_DJS_estimation.m'.
+-Jensen-Tsallis kernel estimation: added; see 'KJT_HJT_initialization.m', 'KJT_HJT_estimation.m'.
+-'quick_test_Kpos_semidef.m': changed to cover the 2 new distribution kernel estimators.
+-URL of the embedded FastKICA package: updated to its new location.
+
+v0.42 (Sep 7, 2013):
 
 New:
 ====
 -Note on compiler requirements: added (see doc).
 -ITE_install.m: made more user-friendly; detection of the already (i) deleted 'ann_wrapperO'/'ann_wrapperM' directory, (ii) downloaded ARfit package: added.
 
-v0.41 (July 12, 2013):
+v0.41 (Jul 12, 2013):
 -Probability product kernel estimation based on k-nearest neighbors: added; see 'KPP_kNN_k_initialization.m' and 'KPP_kNN_k_estimation.m'.
 -Jensen-Shannon kernel estimation: added; see 'KJS_DJS_initialization.m' and 'KJS_DJS_estimation.m'.
 
-v0.40 (June 23, 2013):
+v0.40 (Jun 23, 2013):
 -Bhattacharyya kernel estimation based on k-nearest neighbors: added; see 'KBhattacharyya_kNN_k_initialization.m' and 'KBhattacharyya_kNN_k_estimation.m'.
 -Expected kernel estimation: added; see 'Kexpected_initialization.m', 'Kexpected_estimation.m'.
 -Directory 'H_I_D_A_C' renamed to 'H_I_D_A_C_K'; 'ITE_install.m' modified accordingly.
 -Kernel on distributions (K) object type: added; see 'K_initialization.m', 'K_estimation.m'. 
 
-v0.39 (June 12, 2013):
+v0.39 (Jun 12, 2013):
 -Symmetric Bregman distance estimation based on nonsymmetric Bregman distance: added; see 'DsymBregman_DBregman_initialization.m', 'DsymBregman_DBregman_estimation.m'.
 -Symmetric Bregman distance estimation based on k-nearest neighbors: added; see 'DsymBregman_kNN_k_initialization.m', 'DsymBregman_kNN_k_estimation.m'. 
 
-v0.38 (June 1, 2013):
+v0.38 (Jun 1, 2013):
 -Jensen-Tsallis divergence estimation: added; see 'DJensenTsallis_HTsallis_initialization.m' and 'DJensenTsallis_HTsallis_estimation.m'.
 -Bregman distance estimation: added; see 'DBregman_kNN_k_initialization.m' and 'DBregman_kNN_k_estimation.m'.
 
 
 **Download** the latest release: 
 
-- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE-0.42_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE-0.42_code.tar.bz2), 
-- [documentation (pdf)](https://bitbucket.org/szzoli/ite/downloads/ITE-0.42_documentation.pdf).
+- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE-0.43_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE-0.43_code.tar.bz2), 
+- [documentation (pdf)](https://bitbucket.org/szzoli/ite/downloads/ITE-0.43_documentation.pdf).
 
 

code/estimators/meta_estimators/KEJS_DJS_estimation.m

+function [K] = KEJS_DJS_estimation(Y1,Y2,co)
+%function [K] = KEJS_DJS_estimation(Y1,Y2,co)
+%Estimates the exponentiated Jensen-Shannon kernel of two distributions from which we have samples (Y1 and Y2) using the relation: K_JS(f_1,f_2) = exp[-u x D_JS(f_1,f_2)], where D_JS is the Jensen-Shannon divergence, u>0.
+%
+%Note:
+%   1)We use the naming convention 'K<name>_estimation' to ease embedding new kernels on distributions.
+%   2)This is a meta method: the Jensen-Shannon divergence estimator can be arbitrary.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+%  co: estimator object of a kernel on distributions.
+%
+%REFERENCE: 
+%   Andre F. T. Martins, Noah A. Smith, Eric P. Xing, Pedro M. Q. Aguiar, and Mario A. T. Figueiredo. Nonextensive information theoretical kernels on measures. Journal of Machine Learning Research, 10:935-975, 2009.
+%   Andre F. T. Martins, Pedro M. Q. Aguiar, and Mario A. T. Figueiredo. Tsallis kernels on measures. In Information Theory Workshop (ITW), pages 298-302, 2008.
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK. The information theoretical quantity of interest can be (and is!) estimated exactly [co.mult=1]; the computational complexity of the estimation is essentially the same as that of the 'up to multiplicative constant' case [co.mult=0].
+
+%verification:
+    if size(Y1,1)~=size(Y2,1)
+        error('The dimension of the samples in Y1 and Y2 must be equal.');
+    end
+
+K = exp(-co.u * D_estimation(Y1,Y2,co.member_co));
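The relation K(f_1,f_2) = exp[-u x D_JS(f_1,f_2)] implemented above can be illustrated with a minimal, self-contained sketch for discrete distributions (Python is used here purely for illustration; the helper names `js_divergence` and `ejs_kernel` are hypothetical and not part of the MATLAB/Octave toolbox):

```python
import numpy as np

def js_divergence(p, q, w=(0.5, 0.5)):
    """Jensen-Shannon divergence of two discrete distributions p and q:
    D_JS = H(w1*p + w2*q) - [w1*H(p) + w2*H(q)], with H the Shannon entropy."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = w[0] * p + w[1] * q  # mixture distribution

    def shannon_entropy(d):
        d = d[d > 0]  # 0*log(0) = 0 convention
        return -np.sum(d * np.log(d))

    return shannon_entropy(m) - (w[0] * shannon_entropy(p) + w[1] * shannon_entropy(q))

def ejs_kernel(p, q, u=1.0):
    """Exponentiated Jensen-Shannon kernel: K = exp(-u * D_JS(p, q)), u > 0."""
    return np.exp(-u * js_divergence(p, q))
```

Since D_JS(p,p) = 0, the kernel attains its maximum K(p,p) = 1 on identical distributions, and D_JS >= 0 keeps 0 < K <= 1 throughout.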

code/estimators/meta_estimators/KEJS_DJS_initialization.m

+function [co] = KEJS_DJS_initialization(mult,post_init)
+%function [co] = KEJS_DJS_initialization(mult)
+%function [co] = KEJS_DJS_initialization(mult,post_init)
+%Initialization of the exponentiated Jensen-Shannon kernel estimator defined according to the relation: K_JS(f_1,f_2) = exp[-u x D_JS(f_1,f_2)], where D_JS is the Jensen-Shannon divergence, u>0.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'K<name>_initialization' to ease embedding new estimators for kernels on distributions.
+%   3)This is a meta method: the Jensen-Shannon divergence estimator can be arbitrary.
+%
+%INPUT:
+%   mult: '=1' if a multiplicative constant is relevant (needed) in the estimation; '=0' otherwise.
+%   post_init: {field_name1,field_value1,field_name2,field_value2,...}; cell array containing the names and the values of the cost object fields that are to be used
+%   (instead of their default values). For further details, see 'post_initialization.m'.
+%OUTPUT:
+%   co: cost object (structure).
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields  (following the template structure of the estimators to make uniform usage of the estimators possible):
+    co.name = 'EJS_DJS';
+    co.mult = mult;
+    
+%other fields:
+    co.u = 1; %assumption: u>0
+    co.member_name = 'JensenShannon_HShannon'; %you can change it to any Jensen-Shannon divergence estimator
+
+%post initialization (put it _before_ initialization of the members in case of a meta estimator):    
+    if nargin==2 %there are given (name,value) cost object fields
+        co = post_initialization(co,post_init);
+    end 
+
+%initialization of the member(s):
+    %co.member_co = D_initialization(co.member_name,mult);
+    co.member_co = D_initialization(co.member_name,mult,{'w',[1/2,1/2]}); %{'w',[1/2 1/2]}: the 'w' weight field of the member is also set here; note: if the Jensen-Shannon divergence estimator is restricted to w=[1/2, 1/2], then of course '{...}' can be discarded.

code/estimators/meta_estimators/KJS_DJS_initialization.m

     
 %initialization of the member(s):
     %co.member_co = D_initialization(co.member_name,mult);
-    co.member_co = D_initialization(co.member_name,mult,{'w',[1/2,1/2]}); %{'w',[1/2 1/2]}: the 'w' weight field of the member is also set here; note: if the Jensen-Shannon divergence estimator is restricted to w=[1/2, 1/2], then of course you can leave '{...}'.
+    co.member_co = D_initialization(co.member_name,mult,{'w',[1/2,1/2]}); %{'w',[1/2 1/2]}: the 'w' weight field of the member is also set here; note: if the Jensen-Shannon divergence estimator is restricted to w=[1/2, 1/2], then of course '{...}' can be discarded.
     

code/estimators/meta_estimators/KJT_HJT_estimation.m

+function [K] = KJT_HJT_estimation(Y1,Y2,co)
+%function [K] = KJT_HJT_estimation(Y1,Y2,co)
+%Estimates the Jensen-Tsallis kernel of two distributions from which we have samples (Y1 and Y2) using the relation: K_JT(f_1,f_2) = log_{alpha}(2) - T_alpha(f_1,f_2), where (i) log_{alpha} is the alpha-logarithm, (ii) T_alpha is the Jensen-Tsallis alpha-difference (that can be expressed in terms of the Tsallis entropy).
+%
+%Note:
+%   1)We use the naming convention 'K<name>_estimation' to ease embedding new kernels on distributions.
+%   2)This is a meta method: the Tsallis entropy estimator can be arbitrary.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+%  co: estimator object of a kernel on distributions.
+%
+%REFERENCE: 
+%   Andre F. T. Martins, Noah A. Smith, Eric P. Xing, Pedro M. Q. Aguiar, and Mario A. T. Figueiredo. Nonextensive information theoretical kernels on measures. Journal of Machine Learning Research, 10:935-975, 2009.
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK. The information theoretical quantity of interest can be (and is!) estimated exactly [co.mult=1]; the computational complexity of the estimation is essentially the same as that of the 'up to multiplicative constant' case [co.mult=0].
+
+%verification:
+    if size(Y1,1)~=size(Y2,1)
+        error('The dimension of the samples in Y1 and Y2 must be equal.');
+    end
+
+a = co.alpha; %alpha
+
+%alpha-logarithm of 2:
+    log_alpha_2 = (2^(1-a)-1) / (1-a);
+
+%T (=Jensen-Tsallis alpha-difference):
+    %mixture:    
+        w = [1/2,1/2];%fixed
+        mixtureY = mixture_distribution(Y1,Y2,w);
+    T = H_estimation(mixtureY,co.member_co) - (w(1)^a * H_estimation(Y1,co.member_co) + w(2)^a * H_estimation(Y2,co.member_co)); 
+    
+K = log_alpha_2 - T;
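The computation above (alpha-logarithm of 2 minus the Jensen-Tsallis alpha-difference) can be sketched for discrete distributions as follows; this is an illustrative Python translation with hypothetical helper names, not toolbox code, and it evaluates the entropies exactly rather than estimating them from samples:

```python
import numpy as np

def tsallis_entropy(p, alpha):
    """Tsallis entropy: H_alpha(p) = (1 - sum_i p_i^alpha) / (alpha - 1), alpha != 1."""
    p = np.asarray(p, float)
    return (1.0 - np.sum(p ** alpha)) / (alpha - 1.0)

def jt_kernel(p, q, alpha=0.95):
    """Jensen-Tsallis kernel: K = log_alpha(2) - T_alpha(p, q)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    # alpha-logarithm of 2:
    log_alpha_2 = (2.0 ** (1.0 - alpha) - 1.0) / (1.0 - alpha)
    w = (0.5, 0.5)  # fixed mixture weights, as in the estimator
    m = w[0] * p + w[1] * q  # mixture distribution
    # Jensen-Tsallis alpha-difference (weights enter raised to the power alpha):
    T = tsallis_entropy(m, alpha) - (w[0] ** alpha * tsallis_entropy(p, alpha)
                                     + w[1] ** alpha * tsallis_entropy(q, alpha))
    return log_alpha_2 - T
```

With equal mixture weights the kernel is symmetric in its two arguments, and as alpha approaches 1 the alpha-logarithm and the Tsallis entropy recover log(2) and the Shannon entropy, respectively.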

code/estimators/meta_estimators/KJT_HJT_initialization.m

+function [co] = KJT_HJT_initialization(mult,post_init)
+%function [co] = KJT_HJT_initialization(mult)
+%function [co] = KJT_HJT_initialization(mult,post_init)
+%Initialization of the Jensen-Tsallis kernel estimator defined according to the relation: K_JT(f_1,f_2) = log_{alpha}(2) - T_alpha(f_1,f_2), where (i) log_{alpha} is the alpha-logarithm, (ii) T_alpha is the Jensen-Tsallis alpha-difference (that can be expressed in terms of the Tsallis entropy).
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'K<name>_initialization' to ease embedding new estimators for kernels on distributions.
+%   3)This is a meta method: the Tsallis entropy estimator can be arbitrary.
+%
+%INPUT:
+%   mult: '=1' if a multiplicative constant is relevant (needed) in the estimation; '=0' otherwise.
+%   post_init: {field_name1,field_value1,field_name2,field_value2,...}; cell array containing the names and the values of the cost object fields that are to be used
+%   (instead of their default values). For further details, see 'post_initialization.m'.
+%OUTPUT:
+%   co: cost object (structure).
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields (following the template structure of the estimators to make uniform usage of the estimators possible):
+    co.name = 'JT_HJT';
+    co.mult = mult;
+    
+%other fields:
+    co.alpha = 0.95; %0<=alpha<=2, alpha \ne 1
+    co.member_name = 'Tsallis_kNN_k'; %you can change it to any Tsallis entropy estimator
+
+%post initialization (put it _before_ initialization of the members in case of a meta estimator):    
+    if nargin==2 %there are given (name,value) cost object fields
+        co = post_initialization(co,post_init);
+    end
+
+%initialization of the member(s):
+    co.member_co = H_initialization(co.member_name,mult,{'alpha',co.alpha}); %{'alpha',co.alpha}: the 'alpha' field of the member is also set here
+    
+

code/estimators/quick_tests/quick_test_Kpos_semidef.m

             %cost_name = 'PP_kNN_k';
         %meta:
             %cost_name = 'JS_DJS';
+	        %cost_name = 'EJS_DJS';
+	        %cost_name = 'JT_HJT';
 
 %initialization:    
     num_of_samples_half = floor(num_of_samples/2);
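'quick_test_Kpos_semidef.m' verifies that the Gram matrices of the distribution kernel estimators are positive semi-definite. The core of such a check can be sketched in a few lines (an illustrative Python version under the assumption that the test inspects the smallest eigenvalue of a symmetrized Gram matrix; `is_pos_semidef` is a hypothetical name, not part of the toolbox):

```python
import numpy as np

def is_pos_semidef(G, tol=1e-10):
    """Check positive semi-definiteness of a Gram matrix via its eigenvalues."""
    G = np.asarray(G, float)
    G = (G + G.T) / 2.0  # symmetrize to suppress numerical asymmetry
    # PSD iff the smallest eigenvalue is nonnegative (up to tolerance):
    return bool(np.min(np.linalg.eigvalsh(G)) >= -tol)
```

A valid kernel on distributions must produce a PSD Gram matrix for any collection of sample sets, which is exactly what the quick test probes empirically.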