Commits

Zoltán Szabó committed 2be4fc2

Exponentiated Jensen-Tsallis kernel estimators: added; see 'KEJT1_HT_initialization.m', 'KEJT1_HT_estimation.m', 'KEJT2_DJT_initialization.m', 'KEJT2_DJT_estimation.m'. Quick tests covering the estimators: introduced ('quick_test_KEJT1.m', 'quick_test_KEJT2.m'), updated accordingly ('quick_test_Kpos_semidef.m').
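The two new kernels are defined by K_EJT1(f_1,f_2) = exp[-u H_T((f_1+f_2)/2)] and K_EJT2(f_1,f_2) = exp[-u D_JT(f_1,f_2)]. As a quick, self-contained illustration, here is a Python sketch on discrete distributions, where the Tsallis quantities have closed forms; the function names are hypothetical and this is not the toolbox API:

```python
import numpy as np

def tsallis_entropy(p, alpha=0.95):
    # H_T(p) = (1 - sum_i p_i^alpha) / (alpha - 1), alpha != 1
    return (1.0 - np.sum(p ** alpha)) / (alpha - 1.0)

def k_ejt1(p, q, u=0.1, alpha=0.95):
    # K_EJT1(p, q) = exp(-u * H_T((p+q)/2)), u > 0
    return np.exp(-u * tsallis_entropy((p + q) / 2.0, alpha))

def k_ejt2(p, q, u=1.0, alpha=0.95):
    # K_EJT2(p, q) = exp(-u * D_JT(p, q)), where
    # D_JT(p, q) = H_T((p+q)/2) - (H_T(p) + H_T(q)) / 2
    m = (p + q) / 2.0
    d_jt = tsallis_entropy(m, alpha) \
           - 0.5 * (tsallis_entropy(p, alpha) + tsallis_entropy(q, alpha))
    return np.exp(-u * d_jt)

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
k1, k2 = k_ejt1(p, q), k_ejt2(p, q)
```

For alpha -> 1 the Tsallis entropy recovers the Shannon entropy, which connects these kernels to the Jensen-Shannon kernels already in the toolbox.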


Files changed (10)

+v0.45 (Oct 9, 2013):
+-Exponentiated Jensen-Tsallis kernel estimators based on Tsallis entropy and Jensen-Tsallis divergence: added; see 'KEJT1_HT_initialization.m', 'KEJT1_HT_estimation.m', 'KEJT2_DJT_initialization.m', 'KEJT2_DJT_estimation.m'.
+-Quick tests for the exponentiated Jensen-Tsallis kernel estimators: added, see 'quick_test_KEJT1.m', 'quick_test_KEJT2.m'. 'quick_test_Kpos_semidef.m': changed to cover the 2 new distribution kernel estimators.
+
 v0.44 (Oct 1, 2013):
--Exponentiated Jensen-Renyi kernel estimator based on Renyi entropy and Jensen-Renyi divergence: added; see 'KEJR1_HR_initialization.m', 'KEJR1_HR_estimation.m', 'KEJR2_DJR_initialization.m', 'KEJR2_DJR_estimation.m'.
+-Exponentiated Jensen-Renyi kernel estimators based on Renyi entropy and Jensen-Renyi divergence: added; see 'KEJR1_HR_initialization.m', 'KEJR1_HR_estimation.m', 'KEJR2_DJR_initialization.m', 'KEJR2_DJR_estimation.m'.
 -Quick tests for the exponentiated Jensen-Renyi kernel estimators: added, see 'quick_test_EJR1.m', 'quick_test_KEJR2.m'. 'quick_test_Kpos_semidef.m': changed to cover the 2 new distribution kernel estimators.
 
 v0.43 (Sep 20, 2013):
 - `divergence (D)`: Kullback-Leibler divergence (relative entropy, I directed divergence), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance), J-distance (symmetrised Kullback-Leibler divergence, J divergence), Cauchy-Schwarz divergence, Euclidean distance based divergence, energy distance (specially the Cramer-Von Mises distance), Jensen-Shannon divergence, Jensen-Rényi divergence, K divergence, L divergence, certain f-divergences (Csiszár-Morimoto divergence, Ali-Silvey distance), non-symmetric Bregman distance (Bregman divergence), Jensen-Tsallis divergence, symmetric Bregman distance,
 - `association measures (A)`, including `measures of concordance`: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient, grade correlation coefficient), correntropy, centered correntropy, correntropy coefficient, correntropy induced metric, centered correntropy induced metric, multivariate extension of Blomqvist's beta (medial correlation coefficient), multivariate conditional version of Spearman's rho, lower/upper tail dependence via conditional Spearman's rho,
 - `cross quantities (C)`: cross-entropy,
-- `kernels on distributions (K)`: expected kernel, Bhattacharyya kernel, probability product kernel, Jensen-Shannon kernel, exponentiated Jensen-Shannon kernel, Jensen-Tsallis kernel, exponentiated Jensen-Renyi kernel.
+- `kernels on distributions (K)`: expected kernel, Bhattacharyya kernel, probability product kernel, Jensen-Shannon kernel, exponentiated Jensen-Shannon kernel, Jensen-Tsallis kernel, exponentiated Jensen-Renyi kernel(s), exponentiated Jensen-Tsallis kernel(s).
 
 ITE offers solution methods for 
 
 
 **Download** the latest release: 
 
-- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE-0.44_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE-0.44_code.tar.bz2), 
-- [documentation (pdf)](https://bitbucket.org/szzoli/ite/downloads/ITE-0.44_documentation.pdf).
+- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE-0.45_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE-0.45_code.tar.bz2), 
+- [documentation (pdf)](https://bitbucket.org/szzoli/ite/downloads/ITE-0.45_documentation.pdf).
 
 

code/estimators/meta_estimators/KEJT1_HT_estimation.m

+function [K] = KEJT1_HT_estimation(Y1,Y2,co)
+%function [K] = KEJT1_HT_estimation(Y1,Y2,co)
+%Estimates the exponentiated Jensen-Tsallis kernel-1 of two distributions from which we have samples (Y1 and Y2) using the relation: 
+%K_EJT1(f_1,f_2) = exp[-u x H_T((y^1+y^2)/2)], where H_T is the Tsallis entropy, (y^1+y^2)/2 is the mixture of y^1~f_1 and y^2~f_2 with 1/2-1/2 weights, u>0.
+%
+%Note:
+%   1)We use the naming convention 'K<name>_estimation' to ease embedding new kernels on distributions.
+%   2)This is a meta method: the Tsallis entropy estimator can be arbitrary.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+%  co: estimator object of a kernel on distributions.
+%
+%REFERENCE: 
+%   Andre F. T. Martins, Noah A. Smith, Eric P. Xing, Pedro M. Q. Aguiar, and Mario A. T. Figueiredo. Nonextensive information theoretical kernels on measures. Journal of Machine Learning Research, 10:935-975, 2009.
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK. The information theoretical quantity of interest can be (and is!) estimated exactly [co.mult=1]; the computational complexity of the estimation is essentially the same as that of the 'up to multiplicative constant' case [co.mult=0].
+
+%verification:
+    if size(Y1,1)~=size(Y2,1)
+        error('The dimension of the samples in Y1 and Y2 must be equal.');
+    end
+
+%mixture:    
+    w = [1/2,1/2];
+    mixtureY = mixture_distribution(Y1,Y2,w);
+
+K = exp(-co.u * H_estimation(mixtureY,co.member_co));
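The estimation above delegates to two steps: form samples from the 1/2-1/2 mixture ('mixture_distribution.m', not shown in this diff) and apply any Tsallis entropy estimator to them. One plausible way to realize the mixture step, sketched in Python (a hypothetical helper, not the toolbox's implementation):

```python
import numpy as np

def mixture_sample(Y1, Y2, w=(0.5, 0.5), rng=None):
    # Keep w_i-proportional portions of the samples so the pooled set
    # behaves like a draw from w1*f1 + w2*f2.
    # Y1, Y2: (d, n_i) sample matrices, columns are samples.
    rng = np.random.default_rng() if rng is None else rng
    # largest mixture size n with w_i * n <= n_i for both components:
    n = min(int(Y1.shape[1] / w[0]), int(Y2.shape[1] / w[1]))
    n1 = int(round(w[0] * n))
    n2 = n - n1
    mixed = np.hstack([Y1[:, :n1], Y2[:, :n2]])
    # shuffle the columns so sample order carries no information:
    return mixed[:, rng.permutation(n)]
```

The mixture samples can then be fed to any plug-in entropy estimator, mirroring `H_estimation(mixtureY,co.member_co)` above.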

code/estimators/meta_estimators/KEJT1_HT_initialization.m

+function [co] = KEJT1_HT_initialization(mult,post_init)
+%function [co] = KEJT1_HT_initialization(mult)
+%function [co] = KEJT1_HT_initialization(mult,post_init)
+%Initialization of the exponentiated Jensen-Tsallis kernel-1 estimator defined according to the relation: 
+%K_EJT1(f_1,f_2) = exp[-u x H_T((y^1+y^2)/2)], where H_T is the Tsallis entropy, (y^1+y^2)/2 is the mixture of y^1~f_1 and y^2~f_2 with 1/2-1/2 weights, u>0.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'K<name>_initialization' to ease embedding new estimators for kernels on distributions.
+%   3)This is a meta method: the Tsallis entropy estimator can be arbitrary.
+%
+%INPUT:
+%   mult: is a multiplicative constant relevant (needed) in the estimation? '=1' means yes; '=0' means no.
+%   post_init: {field_name1,field_value1,field_name2,field_value2,...}; cell array containing the names and the values of the cost object fields that are to be used
+%   (instead of their default values). For further details, see 'post_initialization.m'.
+%OUTPUT:
+%   co: cost object (structure).
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields (following the template structure of the estimators to make uniform usage of the estimators possible):
+    co.name = 'EJT1_HT';
+    co.mult = mult;
+    
+%other fields:
+    co.u = 0.1; %assumption: u>0
+    co.alpha = 0.95; %assumption: 0<=alpha<=2, alpha \ne 1
+    co.member_name = 'Tsallis_kNN_k'; %you can change it to any Tsallis entropy estimator
+
+%post initialization (put it _before_ initialization of the members in case of a meta estimator):    
+    if nargin==2 %there are given (name,value) cost object fields
+        co = post_initialization(co,post_init);
+    end
+
+%initialization of the member(s):
+    co.member_co = H_initialization(co.member_name,mult,{'alpha',co.alpha}); %{'alpha',co.alpha}: the 'alpha' field of the member is also set here
+    
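The post_init mechanism lets callers override the default fields (here 'u', 'alpha', 'member_name') before the member entropy estimator is created; this is why it must run before `H_initialization`, as the comment above notes, otherwise `co.member_co` would be built with the default alpha. A Python sketch of the override step, assuming the {field_name1,field_value1,...} convention documented above ('post_initialization.m' itself is not shown in this diff):

```python
def post_initialization(co, post_init):
    # co: cost object modeled as a dict;
    # post_init: [name1, value1, name2, value2, ...] flat list of overrides.
    for name, value in zip(post_init[0::2], post_init[1::2]):
        co[name] = value
    return co

# defaults as set by the initialization routine above:
co = {'name': 'EJT1_HT', 'mult': 1, 'u': 0.1, 'alpha': 0.95,
      'member_name': 'Tsallis_kNN_k'}
# caller-supplied overrides, applied before member initialization:
co = post_initialization(co, ['alpha', 0.5, 'u', 2.0])
```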

code/estimators/meta_estimators/KEJT2_DJT_estimation.m

+function [K] = KEJT2_DJT_estimation(Y1,Y2,co)
+%function [K] = KEJT2_DJT_estimation(Y1,Y2,co)
+%Estimates the exponentiated Jensen-Tsallis kernel-2 of two distributions from which we have samples (Y1 and Y2) using the relation: 
+%K_EJT2(f_1,f_2) = exp[-u x D_JT(f_1,f_2)], where D_JT is the Jensen-Tsallis divergence, u>0.
+%
+%Note:
+%   1)We use the naming convention 'K<name>_estimation' to ease embedding new kernels on distributions.
+%   2)This is a meta method: the Jensen-Tsallis divergence estimator can be arbitrary.
+%
+%INPUT:
+%  Y1: Y1(:,t) is the t^th sample from the first distribution.
+%  Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
+%  co: estimator object of a kernel on distributions.
+%
+%REFERENCE: 
+%   Andre F. T. Martins, Noah A. Smith, Eric P. Xing, Pedro M. Q. Aguiar, and Mario A. T. Figueiredo. Nonextensive information theoretical kernels on measures. Journal of Machine Learning Research, 10:935-975, 2009.
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%co.mult:OK. The information theoretical quantity of interest can be (and is!) estimated exactly [co.mult=1]; the computational complexity of the estimation is essentially the same as that of the 'up to multiplicative constant' case [co.mult=0].
+
+%verification:
+    if size(Y1,1)~=size(Y2,1)
+        error('The dimension of the samples in Y1 and Y2 must be equal.');
+    end
+
+K = exp(-co.u * D_estimation(Y1,Y2,co.member_co));

code/estimators/meta_estimators/KEJT2_DJT_initialization.m

+function [co] = KEJT2_DJT_initialization(mult,post_init)
+%function [co] = KEJT2_DJT_initialization(mult)
+%function [co] = KEJT2_DJT_initialization(mult,post_init)
+%Initialization of the exponentiated Jensen-Tsallis kernel-2 estimator defined according to the relation: 
+%K_EJT2(f_1,f_2) = exp[-u x D_JT(f_1,f_2)], where D_JT is the Jensen-Tsallis divergence, u>0.
+%
+%Note:
+%   1)The estimator is treated as a cost object (co).
+%   2)We use the naming convention 'K<name>_initialization' to ease embedding new estimators for kernels on distributions.
+%   3)This is a meta method: the Jensen-Tsallis divergence estimator can be arbitrary.
+%
+%INPUT:
+%   mult: is a multiplicative constant relevant (needed) in the estimation? '=1' means yes; '=0' means no.
+%   post_init: {field_name1,field_value1,field_name2,field_value2,...}; cell array containing the names and the values of the cost object fields that are to be used
+%   (instead of their default values). For further details, see 'post_initialization.m'.
+%OUTPUT:
+%   co: cost object (structure).
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%mandatory fields (following the template structure of the estimators to make uniform usage of the estimators possible):
+    co.name = 'EJT2_DJT';
+    co.mult = mult;
+    
+%other fields:
+    co.u = 1; %assumption: u>0
+    co.alpha = 0.95; %assumption: 0<=alpha<=2, alpha \ne 1
+    co.member_name = 'JensenTsallis_HTsallis'; %you can change it to any Jensen-Tsallis divergence estimator
+
+%post initialization (put it _before_ initialization of the members in case of a meta estimator):    
+    if nargin==2 %there are given (name,value) cost object fields
+        co = post_initialization(co,post_init);
+    end 
+
+%initialization of the member(s):
+    co.member_co = D_initialization(co.member_name,mult,{'alpha',co.alpha}); %{'alpha',co.alpha}: the 'alpha' field of the member is also set here
+

code/estimators/quick_tests/quick_test_KEJR2.m

     co = K_initialization(cost_name,1,{'alpha',2,'u',u}); %{'alpha',2,'u',u}: set the 'alpha' and 'u' fields. Note: there exists an explicit formula in the case of alpha = 2 for the Jensen-Renyi divergence => for the Jensen-Renyi kernel too.
     num_of_samples_max = num_of_samples_v(end);
     K_hat_v = zeros(L,1); %vector of the estimated Jensen-Renyi kernel
-    w = [1/2;1/2];%parameter of Jensen-Renyi divergence, fixed
     
 %distr, d -> samples (Y1,Y2), analytical formula for the Jensen-Renyi kernel (K):
     switch distr    
                 Y2 = randn(d,num_of_samples_max) * s2 + repmat(m2,1,num_of_samples_max);
 
   	   %analytical value of Jensen-Renyi divergence (ref.: Fei Wang, Tanveer Syeda-Mahmood, Baba C. Vemuri, David Beymer, and Anand Rangarajan. Closed-Form Jensen-Renyi Divergence for Mixture of Gaussians and Applications to Group-Wise Shape Registration. Medical Image Computing and Computer-Assisted Intervention, 12: 648–655, 2009.)
+                w = [1/2;1/2];%parameter of Jensen-Renyi divergence, fixed
 	        ms = [m1,m2];
 	        ss = [s1,s2];
 	        term1 = compute_H2(w,ms,ss);

code/estimators/quick_tests/quick_test_KEJT1.m

+%function [] = quick_test_KEJT1()
+%Quick test for exponentiated Jensen-Tsallis kernel-1 estimators: analytical expression vs estimated value as a function of the sample number. In the test, normal variables are considered.
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%clear start:
+    clear all; close all;
+
+%parameters:
+    distr = 'normal'; %fixed
+    d = 1; %dimension of the distribution
+    u_K = 0.8;%>0, parameter of the Jensen-Tsallis kernel
+    num_of_samples_v = [1000:2000:50*1000]; %sample numbers used for estimation
+    %estimator, meta:
+        cost_name = 'EJT1_HT';
+    
+%initialization:
+    num_of_samples_max = num_of_samples_v(end);
+    L = length(num_of_samples_v);
+    alpha_K = 2; %\in [0,2]\{1}; parameter of the Jensen-Tsallis kernel; fixed; for alpha_K = 2 there is an explicit formula for the Renyi entropy => for the Jensen-Tsallis kernel too
+    %initialize the cost object, set the 'alpha' and 'u' parameters:
+        co = K_initialization(cost_name,1,{'alpha',alpha_K,'u',u_K}); %{'alpha',alpha_K,'u',u_K}: set the 'alpha' and 'u' fields
+    K_hat_v = zeros(L,1); %vector of estimated kernel values
+    
+%distr, d -> samples (Y1,Y2), analytical formula for the Jensen-Tsallis kernel (K):
+    switch distr
+        case 'normal'
+            %generate samples (Y1,Y2):
+                %Y1~N(m1,s1^2xI), Y2~N(m2,s2^2xI):
+                   m1 = randn(d,1);  s1 = rand;
+                   m2 = randn(d,1);  s2 = rand;
+                Y1 = randn(d,num_of_samples_max) * s1 + repmat(m1,1,num_of_samples_max);
+                Y2 = randn(d,num_of_samples_max) * s2 + repmat(m2,1,num_of_samples_max);
+            %analytical value of Renyi entropy (ref.: Fei Wang, Tanveer Syeda-Mahmood, Baba C. Vemuri, David Beymer, and Anand Rangarajan. Closed-Form Jensen-Renyi Divergence for Mixture of Gaussians and Applications to Group-Wise Shape Registration. Medical Image Computing and Computer-Assisted Intervention, 12: 648–655, 2009.)
+    	        ms = [m1,m2];
+    	        ss = [s1,s2];
+                HR = compute_H2([1/2,1/2],ms,ss);%quadratic Renyi entropy
+            %quadratic Renyi entropy -> quadratic Tsallis entropy:
+                HT = 1 - exp(-HR);
+            K = exp(-u_K*HT);
+        otherwise
+            error('Distribution=?');                 
+    end
+    
+%estimation:
+    Tk = 0;%index of the sample number examined   
+    for num_of_samples = num_of_samples_v
+        Tk = Tk + 1;
+        K_hat_v(Tk) = K_estimation(Y1(:,1:num_of_samples),Y2(:,1:num_of_samples),co);
+        disp(strcat('Tk=',num2str(Tk),'/',num2str(L)));
+    end
+    
+%plot:
+    plot(num_of_samples_v,K_hat_v,'r',num_of_samples_v,K*ones(L,1),'g');
+    legend({'estimation','analytical value'});
+    xlabel('Number of samples');
+    ylabel('Jensen-Tsallis kernel-1');
+ 
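The test above relies on 'compute_H2.m' (not part of this diff) for the quadratic Renyi entropy of a Gaussian mixture, then converts it via HT = 1 - exp(-HR). A plausible Python sketch of that closed form, based on the Gaussian product integral int N(x;m_i,s_i^2 I) N(x;m_j,s_j^2 I) dx = N(m_i; m_j, (s_i^2 + s_j^2) I) (the exact internals of 'compute_H2.m' are an assumption here):

```python
import numpy as np

def gauss_pdf(x, m, var):
    # isotropic Gaussian density N(x; m, var * I)
    d = x.shape[0]
    diff = x - m
    return np.exp(-0.5 * diff @ diff / var) / (2.0 * np.pi * var) ** (d / 2.0)

def quadratic_renyi_entropy(w, ms, ss):
    # H2 of the mixture sum_i w_i N(m_i, s_i^2 I):
    # H2 = -log sum_{i,j} w_i w_j N(m_i; m_j, (s_i^2 + s_j^2) I)
    # ms: (d, k) means, ss: (k,) standard deviations.
    k = len(w)
    acc = 0.0
    for i in range(k):
        for j in range(k):
            acc += w[i] * w[j] * gauss_pdf(ms[:, i], ms[:, j],
                                           ss[i] ** 2 + ss[j] ** 2)
    return -np.log(acc)

# sanity check against the single-Gaussian closed form in 1D:
# H2(N(m, s^2)) = log(2 * s * sqrt(pi))
s = 0.7
H2 = quadratic_renyi_entropy([1.0], np.zeros((1, 1)), np.array([s]))
HT = 1.0 - np.exp(-H2)  # quadratic Renyi -> quadratic Tsallis entropy
```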

code/estimators/quick_tests/quick_test_KEJT2.m

+%function [] = quick_test_KEJT2()
+%Quick test for exponentiated Jensen-Tsallis kernel-2 estimators: analytical expression vs estimated value as a function of the sample number. In the test, normal variables are considered.
+
+%Copyright (C) 2013 Zoltan Szabo ("http://www.gatsby.ucl.ac.uk/~szabo/", "zoltan (dot) szabo (at) gatsby (dot) ucl (dot) ac (dot) uk")
+%
+%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
+%
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+
+%clear start:
+    clear all; close all;
+    
+%parameters:    
+    distr = 'normal'; %fixed
+    d = 1; %dimension of the distribution
+    num_of_samples_v = [100:500:12*1000]; %sample numbers used for estimation
+    u = 0.8; %>0, parameter of the Jensen-Tsallis kernel
+    %estimator, meta:    
+    	cost_name = 'EJT2_DJT';
+        
+%initialization:
+    L = length(num_of_samples_v);
+    co = K_initialization(cost_name,1,{'alpha',2,'u',u}); %{'alpha',2,'u',u}: set the 'alpha' and 'u' fields. Note: for alpha = 2 there exists an explicit formula for the Jensen-Tsallis divergence (via the quadratic Renyi entropy) => for the Jensen-Tsallis kernel too.
+    num_of_samples_max = num_of_samples_v(end);
+    K_hat_v = zeros(L,1); %vector of the estimated Jensen-Tsallis kernel
+    
+%distr, d -> samples (Y1,Y2), analytical formula for the Jensen-Tsallis kernel (K):
+    switch distr    
+        case 'normal'
+            %generate samples (Y1,Y2):
+                %Y1~N(m1,s1^2xI), Y2~N(m2,s2^2xI):
+                   m1 = randn(d,1);  s1 = rand;
+                   m2 = randn(d,1);  s2 = rand;
+                Y1 = randn(d,num_of_samples_max) * s1 + repmat(m1,1,num_of_samples_max);
+                Y2 = randn(d,num_of_samples_max) * s2 + repmat(m2,1,num_of_samples_max);
+
+  	   %analytical value of the Jensen-Tsallis divergence, via the closed-form quadratic Renyi entropy (ref.: Fei Wang, Tanveer Syeda-Mahmood, Baba C. Vemuri, David Beymer, and Anand Rangarajan. Closed-Form Jensen-Renyi Divergence for Mixture of Gaussians and Applications to Group-Wise Shape Registration. Medical Image Computing and Computer-Assisted Intervention, 12: 648–655, 2009.)
+            w = [1/2;1/2];%parameter of Jensen-Tsallis divergence, fixed
+	        ms = [m1,m2];
+	        ss = [s1,s2];
+	        term1 = 1 - exp(-compute_H2(w,ms,ss));%quadratic Renyi entropy -> quadratic Tsallis entropy
+	        term2 = w(1) * (1-exp(-compute_H2(1,m1,s1))) + w(2) * (1-exp(-compute_H2(1,m2,s2)));%quadratic Renyi entropy -> quadratic Tsallis entropy
+	        D  = term1 - term2; %H2(\sum_i wi Yi) - \sum_i w_i H2(Yi), where H2 is the quadratic Tsallis entropy
+        %analytical value of Jensen-Tsallis kernel-2:
+            K = exp(-u*D);
+        otherwise
+            error('Distribution=?');                 
+    end
+    
+%estimation:
+    Tk = 0;%index of the sample number examined
+    for num_of_samples = num_of_samples_v
+        Tk = Tk + 1;
+        K_hat_v(Tk) = K_estimation(Y1(:,1:num_of_samples),Y2(:,1:num_of_samples),co);
+        disp(strcat('Tk=',num2str(Tk),'/',num2str(L)));
+    end
+    
+%plot:
+    plot(num_of_samples_v,K_hat_v,'r',num_of_samples_v,ones(L,1)*K,'g');
+    legend({'estimation','analytical value'});
+    xlabel('Number of samples');
+    ylabel('Jensen-Tsallis kernel-2');
+
+
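In LaTeX form, the alpha = 2 shortcut assembled above (quadratic Tsallis entropy from quadratic Renyi entropy, then the divergence and the kernel) is:

```latex
T_2(f) = 1 - e^{-H_2(f)}, \qquad H_2(f) = -\log \int f^2(x)\,dx,
\qquad
D_{JT}(f_1,f_2) = T_2\!\left(\tfrac{f_1+f_2}{2}\right)
                  - \tfrac{1}{2}\,T_2(f_1) - \tfrac{1}{2}\,T_2(f_2),
\qquad
K_{EJT2}(f_1,f_2) = e^{-u\,D_{JT}(f_1,f_2)}, \quad u > 0.
```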

code/estimators/quick_tests/quick_test_Kpos_semidef.m

 	        %cost_name = 'JT_HJT';
 	        %cost_name = 'EJR1_HR';
 	        %cost_name = 'EJR2_DJR';
+	        %cost_name = 'EJT1_HT';
+	        %cost_name = 'EJT2_DJT';
 
 %initialization:    
     num_of_samples_half = floor(num_of_samples/2);