Zoltan Szabo committed 3e697ee

accents: deleted from comments

Files changed (86)

+-accents: deleted from comments.
+
 v0.13 (Oct 27, 2012):
 -Tsallis entropy is now available in ITE; it can be estimated via k-nearest neighbors, see 'HTsallis_kNN_k_initialization.m', 'HTsallis_kNN_k_estimation.m'.
 -A '/'->'*' typo corrected in 'HRenyi_kNN_k_estimation.m'; see 'estimate_Ialpha.m' (V).
 - its extensions to different linear-, controlled-, post nonlinear-, complex valued-, partially observed systems, as well as to systems
 with nonparametric source dynamics. 
 
-Note: become a [Follower](https://bitbucket.org/szzoli/ite/follow) to be always up-to-date with ITE.
+Note: 
+
+- become a [Follower](https://bitbucket.org/szzoli/ite/follow) to be always up-to-date with ITE.
+- the evolution of the ITE code is briefly summarized in CHANGELOG.txt.

code/H_I_D/H_estimation.m

 function [H] = H_estimation(Y,co)
-%Entropy estimation (H) of Y (Y(:,t) is the t^th) sample, using the specified entropy estimator/cost object
-%co.
+%Entropy estimation (H) of Y (Y(:,t) is the t^th sample) using the specified entropy estimator/cost object co.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %
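
A minimal usage sketch for the estimator/cost-object interface above (assumes ITE, with its 'Shannon_kNN_k' estimator, is on the Matlab/Octave path):

    Y = randn(3,1000);                        %Y(:,t) is the t^th sample (d=3, T=1000)
    co = H_initialization('Shannon_kNN_k',1); %mult=1: multiplicative-constant-exact estimation
    H = H_estimation(Y,co);                   %estimated Shannon entropy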

code/H_I_D/base_estimators/DL2_kNN_k_estimation.m

 %We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
 %
 %REFERENCE: 
-%   Barnab�s P�czos, Zolt�n Szab�, Jeff Schneider: Nonparametric divergence estimators for Independent Subspace Analysis. EUSIPCO-2011, pages 1849-1853.
-%   Barnab�s P�czos, Liang Xiong, Jeff Schneider. Nonparametric Divergence: Estimation with Applications to Machine Learning on Distributions. UAI-2011.
+%   Barnabas Poczos, Zoltan Szabo, Jeff Schneider: Nonparametric divergence estimators for Independent Subspace Analysis. EUSIPCO-2011, pages 1849-1853.
+%   Barnabas Poczos, Liang Xiong, Jeff Schneider. Nonparametric Divergence: Estimation with Applications to Machine Learning on Distributions. UAI-2011.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/H_I_D/base_estimators/DMMDonline_estimation.m

 %We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
 %
 %REFERENCE: 
-%  Arthur Gretton, Karsten M. Borgwardt, Malte J. Rasch, Bernhard Sch�lkopf and Alexander Smola. A Kernel Two-Sample Test. Journal of Machine  Learning Research 13 (2012) 723-773. See Lemma 14.
+%  Arthur Gretton, Karsten M. Borgwardt, Malte J. Rasch, Bernhard Scholkopf and Alexander Smola. A Kernel Two-Sample Test. Journal of Machine Learning Research 13 (2012) 723-773. See Lemma 14.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/H_I_D/base_estimators/DRenyi_kNN_k_estimation.m

 function [D] = DRenyi_kNN_k_estimation(X,Y,co)
-%Estimates the Rnyi divergence (D) of X and Y (X(:,t), Y(:,t) is the t^th sample)
+%Estimates the Renyi divergence (D) of X and Y (X(:,t), Y(:,t) is the t^th sample)
 %using the kNN method (S={k}). The number of samples in X [=size(X,2)] and Y [=size(Y,2)] can be different. Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
 %
 %REFERENCE: 
-%   Barnab�s P�czos, Zolt�n Szab�, Jeff Schneider: Nonparametric divergence estimators for Independent Subspace Analysis. EUSIPCO-2011, pages 1849-1853.
-%   Barnab�s P�czos, Jeff Schneider: On the Estimation of alpha-Divergences. AISTATS-2011, pages 609-617.
-%   Barnab�s P�czos, Liang Xiong, Jeff Schneider. Nonparametric Divergence: Estimation with Applications to Machine Learning on Distributions. UAI-2011.
+%   Barnabas Poczos, Zoltan Szabo, Jeff Schneider: Nonparametric divergence estimators for Independent Subspace Analysis. EUSIPCO-2011, pages 1849-1853.
+%   Barnabas Poczos, Jeff Schneider: On the Estimation of alpha-Divergences. AISTATS-2011, pages 609-617.
+%   Barnabas Poczos, Liang Xiong, Jeff Schneider. Nonparametric Divergence: Estimation with Applications to Machine Learning on Distributions. UAI-2011.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/H_I_D/base_estimators/DRenyi_kNN_k_initialization.m

 function [co] = DRenyi_kNN_k_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={k}) based Rnyi divergence estimator.
+%Initialization of the kNN (k-nearest neighbor, S={k}) based Renyi divergence estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
             %co.k = 3;%k-nearest neighbors
            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true distances by at most a factor of (1+epsi).
             
-    co.alpha = 0.99; %The Rnyi divergence equals to the Kullback-Leibler divergence in limit, i.e., D_{R,alpha} -> KL, provided that alpha ->1.
+    co.alpha = 0.99; %The Renyi divergence equals the Kullback-Leibler divergence in the limit, i.e., D_{R,alpha} -> KL as alpha -> 1.
 
 %initialize the ann wrapper in Octave, if needed:
     initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
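
A minimal usage sketch for this initialization/estimation pair (the sample numbers of the two sets may differ, as noted above):

    X = randn(2,2000); Y = randn(2,3000) + 1; %X(:,t), Y(:,t): samples from the two distributions
    co = DRenyi_kNN_k_initialization(1);      %co.alpha = 0.99 => close to the Kullback-Leibler divergence
    D = DRenyi_kNN_k_estimation(X,Y,co);      %estimated Renyi divergence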

code/H_I_D/base_estimators/DTsallis_kNN_k_estimation.m

 %We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
 %
 %REFERENCE: 
-%   Barnab�s P�czos, Zolt�n Szab�, Jeff Schneider: Nonparametric divergence estimators for Independent Subspace Analysis. EUSIPCO-2011, pages 1849-1853.
-%   Barnab�s P�czos, Jeff Schneider: On the Estimation of alpha-Divergences. AISTATS-2011, pages 609-617.
+%   Barnabas Poczos, Zoltan Szabo, Jeff Schneider: Nonparametric divergence estimators for Independent Subspace Analysis. EUSIPCO-2011, pages 1849-1853.
+%   Barnabas Poczos, Jeff Schneider: On the Estimation of alpha-Divergences. AISTATS-2011, pages 609-617.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/H_I_D/base_estimators/HRenyi_GSF_estimation.m

 function [H] = HRenyi_GSF_estimation(Y,co)
-%Estimates the Rnyi entropy (H) of Y (Y(:,t) is the t^th sample)
+%Estimates the Renyi entropy (H) of Y (Y(:,t) is the t^th sample)
 %using the GSF method. Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
 %
 %REFERENCE: 
-%   Barnabs Pczos, Andrs Lrincz. Independent Subspace Analysis Using Geodesic Spanning Trees. ICML-2005, pages 673-680.
+%   Barnabas Poczos, Andras Lorincz. Independent Subspace Analysis Using Geodesic Spanning Trees. ICML-2005, pages 673-680.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/H_I_D/base_estimators/HRenyi_GSF_initialization.m

 function [co] = HRenyi_GSF_initialization(mult)
-%Initialization of the GSF (geodesic spanning forest) based Rnyi entropy estimator.
+%Initialization of the GSF (geodesic spanning forest) based Renyi entropy estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
     %Method to compute the geodesic spanning forest:
         co.GSFmethod = 'MatlabBGL_Kruskal';
 
-    co.alpha = 0.99; %The R�nyi entropy equals to the Shannon differential entropy, in limit, i.e., Renyi=H_{R,alpha} -> Shannon=H, provided that alpha ->1.
+    co.alpha = 0.99; %The Renyi entropy equals the Shannon differential entropy in the limit, i.e., Renyi=H_{R,alpha} -> Shannon=H as alpha -> 1.
 
 %initialize the ann wrapper in Octave, if needed:
     initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);

code/H_I_D/base_estimators/HRenyi_MST_estimation.m

 function [H] = HRenyi_MST_estimation(Y,co)
-%Estimates the Rnyi entropy (H) of Y (Y(:,t) is the t^th sample)
+%Estimates the Renyi entropy (H) of Y (Y(:,t) is the t^th sample)
 %using the minimum spanning tree (MST). Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
 %
 %REFERENCE: 
-%   Barnabs Pczos, Andrs Lrincz: Independent Subspace Analysis Using Geodesic Spanning Trees. ICML-2005, pages 673-680. (application in ISA)
+%   Barnabas Poczos, Andras Lorincz: Independent Subspace Analysis Using Geodesic Spanning Trees. ICML-2005, pages 673-680. (application in ISA)
 %   Joseph E. Yukich. Probability Theory of Classical Euclidean Optimization Problems, Lecture Notes in Mathematics, 1998, vol. 1675.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")

code/H_I_D/base_estimators/HRenyi_MST_initialization.m

 function [co] = HRenyi_MST_initialization(mult)
-%Initialization of the minimum spanning tree (MST) based Rnyi entropy estimator.
+%Initialization of the minimum spanning tree (MST) based Renyi entropy estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
     co.mult = mult;
     
 %other fields:
-    co.alpha = 0.99; %The R�nyi entropy equals to the Shannon differential entropy, in limit, i.e., Renyi=H_{R,alpha} -> Shannon=H, provided that alpha ->1.
+    co.alpha = 0.99; %The Renyi entropy equals the Shannon differential entropy in the limit, i.e., Renyi=H_{R,alpha} -> Shannon=H as alpha -> 1.
     %Possibilites for the MST (minimum spanning tree) method:
         co.MSTmethod = 'MatlabBGL_Prim';
         %co.MSTmethod = 'MatlabBGL_Kruskal';

code/H_I_D/base_estimators/HRenyi_kNN_1tok_estimation.m

 function [H] = HRenyi_kNN_1tok_estimation(Y,co)
-%Estimates the Rnyi entropy (H) of Y (Y(:,t) is the t^th sample)
+%Estimates the Renyi entropy (H) of Y (Y(:,t) is the t^th sample)
 %using the kNN method (S={1,...,k}). Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
 %
 %REFERENCE: 
-%   Barnabs Pczos, Andrs Lrincz. Independent Subspace Analysis Using k-Nearest Neighborhood Estimates. ICANN-2005, pages 163-168.
+%   Barnabas Poczos, Andras Lorincz. Independent Subspace Analysis Using k-Nearest Neighborhood Estimates. ICANN-2005, pages 163-168.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/H_I_D/base_estimators/HRenyi_kNN_1tok_initialization.m

 function [co] = HRenyi_kNN_1tok_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={1,...,k}) based Rnyi entropy estimator.
+%Initialization of the kNN (k-nearest neighbor, S={1,...,k}) based Renyi entropy estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
             %co.k = 3;%k-nearest neighbors
            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true distances by at most a factor of (1+epsi).
 
-    co.alpha = 0.99; %The R�nyi entropy equals to the Shannon differential entropy, in limit, i.e., Renyi=H_{R,alpha} -> Shannon=H, provided that alpha ->1.
+    co.alpha = 0.99; %The Renyi entropy equals the Shannon differential entropy in the limit, i.e., Renyi=H_{R,alpha} -> Shannon=H as alpha -> 1.
 
 %initialize the ann wrapper in Octave, if needed:
     initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);

code/H_I_D/base_estimators/HRenyi_kNN_S_estimation.m

 function [H] = HRenyi_kNN_S_estimation(Y,co)
-%Estimates the R�nyi entropy (H) of Y (Y(:,t) is the t^th sample) using the generalized k-nearest neighbor (S\subseteq {1,...,k}) method. Cost parameters are provided in the cost object co.
+%Estimates the Renyi entropy (H) of Y (Y(:,t) is the t^th sample) using the generalized k-nearest neighbor (S\subseteq {1,...,k}) method. Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
 %
 %REFERENCE: 
-%   D�vid P�l, Barnab�s P�czos, Csaba Szepesv�ri: Estimation of R�nyi Entropy and Mutual Information Based on Generalized Nearest-Neighbor Graphs. NIPS-2010, pages 1849-1857.
+%   David Pal, Barnabas Poczos, Csaba Szepesvari: Estimation of Renyi Entropy and Mutual Information Based on Generalized Nearest-Neighbor Graphs. NIPS-2010, pages 1849-1857.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/H_I_D/base_estimators/HRenyi_kNN_S_initialization.m

 function [co] = HRenyi_kNN_S_initialization(mult)
-%Initialization of the generalized k-nearest neighbor (S\subseteq {1,...,k}) based Rnyi entropy estimator.
+%Initialization of the generalized k-nearest neighbor (S\subseteq {1,...,k}) based Renyi entropy estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
             %co.k = [1,2,4];%=S: nearest neighbor set
            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true distances by at most a factor of (1+epsi).
 				
-    co.alpha = 0.99; %The R�nyi entropy equals to the Shannon differential entropy, in limit, i.e., Renyi=H_{R,alpha} -> Shannon=H, provided that alpha ->1.
+    co.alpha = 0.99; %The Renyi entropy equals the Shannon differential entropy in the limit, i.e., Renyi=H_{R,alpha} -> Shannon=H as alpha -> 1.
     
 %initialize the ann wrapper in Octave, if needed:
     initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);

code/H_I_D/base_estimators/HRenyi_kNN_k_estimation.m

 function [H] = HRenyi_kNN_k_estimation(Y,co)
-%Estimates the Rnyi entropy (H) of Y (Y(:,t) is the t^th sample)
+%Estimates the Renyi entropy (H) of Y (Y(:,t) is the t^th sample)
 %using the kNN method (S={k}). Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
 %
 %REFERENCE: 
-%   Nikolai Leonenko, Luc Pronzato, and Vippal Savani. A class of R�nyi information estimators for multidimensional densities. Annals of Statistics, 36(5):2153�2182, 2008.
+%   Nikolai Leonenko, Luc Pronzato, and Vippal Savani. A class of Renyi information estimators for multidimensional densities. Annals of Statistics, 36(5):2153-2182, 2008.
 %   Joseph E. Yukich. Probability Theory of Classical Euclidean Optimization Problems, Lecture Notes in Mathematics, 1998, vol. 1675.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")

code/H_I_D/base_estimators/HRenyi_kNN_k_initialization.m

 function [co] = HRenyi_kNN_k_initialization(mult)
-%Initialization of the kNN (k-nearest neighbor, S={k}) based Rnyi entropy estimator.
+%Initialization of the kNN (k-nearest neighbor, S={k}) based Renyi entropy estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
             %co.k = 3;%k-nearest neighbors
            %co.epsi = 0; %=0: exact kNN; >0: approximate kNN, the returned (not squared) distances may exceed the true distances by at most a factor of (1+epsi).
             
-    co.alpha = 0.95; %The R�nyi entropy equals to the Shannon differential entropy, in limit, i.e., Renyi=H_{R,alpha} -> Shannon=H, provided that alpha ->1.
+    co.alpha = 0.95; %The Renyi entropy equals the Shannon differential entropy in the limit, i.e., Renyi=H_{R,alpha} -> Shannon=H as alpha -> 1.
     
 %initialize the ann wrapper in Octave, if needed:
     initialize_Octave_ann_wrapper_if_needed(co.kNNmethod);
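
A minimal usage sketch, with samples stored columnwise as above:

    Y = rand(2,2000);                    %Y(:,t) is the t^th sample
    co = HRenyi_kNN_k_initialization(1); %co.alpha = 0.95 (see above)
    H = HRenyi_kNN_k_estimation(Y,co);   %estimated Renyi entropy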

code/H_I_D/base_estimators/HRenyi_weightedkNN_estimation.m

 function [H] = HRenyi_weightedkNN_estimation(Y,co)
-%Estimates the Rnyi entropy (H) of Y (Y(:,t) is the t^th sample)
+%Estimates the Renyi entropy (H) of Y (Y(:,t) is the t^th sample)
 %using the weighted k-nearest neighbor method. Cost parameters are provided in the cost object co.
 %
 %We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
 %
 %REFERENCE: 
-%   Kumar Sricharan and Alfred. O. Hero. Weighted k-NN graphs for R�nyi entropy estimation in high dimensions, IEEE Workshop on Statistical Signal Processing (SSP), pages 773-776, 2011. 
+%   Kumar Sricharan and Alfred. O. Hero. Weighted k-NN graphs for Renyi entropy estimation in high dimensions, IEEE Workshop on Statistical Signal Processing (SSP), pages 773-776, 2011. 
 %NOTE:
%  This code has been written on the basis of the implementation kindly provided by Kumar Sricharan.
 %

code/H_I_D/base_estimators/HRenyi_weightedkNN_initialization.m

 function [co] = HRenyi_weightedkNN_initialization(mult)
-%Initialization of the weighted kNN based Rnyi entropy (H) estimator.
+%Initialization of the weighted kNN based Renyi entropy (H) estimator.
 %
 %Note:
 %   1)The estimator is treated as a cost object (co).
     co.mult = mult;
     
 %other fields:
-    co.alpha = 0.95; %The R�nyi entropy equals to the Shannon differential entropy, in limit, i.e., Renyi=H_{R,alpha} -> Shannon=H, provided that alpha ->1.
+    co.alpha = 0.95; %The Renyi entropy equals the Shannon differential entropy in the limit, i.e., Renyi=H_{R,alpha} -> Shannon=H as alpha -> 1.
     %Possibilities for 'co.kNNmethod' (see 'kNN_squared_distances.m'): 
         %I: 'knnFP1': fast pairwise distance computation and C++ partial sort; parameter: co.k.
        %II: 'knnFP2': fast pairwise distance computation; parameter: co.k.

code/H_I_D/base_estimators/HTsallis_kNN_k_estimation.m

 %We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.
 %
 %REFERENCE: 
-%   Nikolai Leonenko, Luc Pronzato, and Vippal Savani. A class of R�nyi information estimators for multidimensional densities. Annals of Statistics, 36(5):2153�2182, 2008.
+%   Nikolai Leonenko, Luc Pronzato, and Vippal Savani. A class of Renyi information estimators for multidimensional densities. Annals of Statistics, 36(5):2153-2182, 2008.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/H_I_D/base_estimators/IGV_initialization.m

 %OUTPUT:
 %   co: object (structure).
 %REFERENCE: 
-%   Zoltn Szab, Andrs Lrincz: Real and Complex Independent Subspace Analysis by Generalized Variance. ICARN-2006, pages 85-88.
+%   Zoltan Szabo, Andras Lorincz: Real and Complex Independent Subspace Analysis by Generalized Variance. ICARN-2006, pages 85-88.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/H_I_D/base_estimators/IHSIC_estimation.m

 %  ds: subspace dimensions.
 %  co: initialized mutual information estimator object.
 %REFERENCE:
-%   Arthur Gretton,  Olivier Bousquet, Alexander Smola and Bernhard Sch�lkopf: Measuring Statistical Dependence with Hilbert-Schmidt Norms. ALT 2005, 63-78.
+%   Arthur Gretton, Olivier Bousquet, Alexander Smola and Bernhard Scholkopf: Measuring Statistical Dependence with Hilbert-Schmidt Norms. ALT 2005, 63-78.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/H_I_D/base_estimators/IKCCA_estimation.m

 %  ds: subspace dimensions.
 %  co: initialized mutual information estimator object.
 %REFERENCE:
-%   Zolt�n Szab�, Barnab�s P�czos, Andr�s L�rincz: Undercomplete Blind Subspace Deconvolution. Journal of Machine Learning Research 8(May):1063-1095, 2007. (multidimensional case, i.e., ds(j)>=1)
+%   Zoltan Szabo, Barnabas Poczos, Andras Lorincz: Undercomplete Blind Subspace Deconvolution. Journal of Machine Learning Research 8(May):1063-1095, 2007. (multidimensional case, i.e., ds(j)>=1)
 %   Francis Bach, Michael I. Jordan. Kernel Independent Component Analysis. Journal of Machine Learning Research, 3: 1-48, 2002. (one-dimensional case, i.e., ds(1)=ds(2)=...=ds(end)=1)
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")

code/H_I_D/base_estimators/IKGV_estimation.m

 %  ds: subspace dimensions.
 %  co: initialized mutual information estimator object.
 %REFERENCE:
-%   Zolt�n Szab�, Barnab�s P�czos, Andr�s L�rincz: Undercomplete Blind Subspace Deconvolution. Journal of Machine Learning Research 8(May):1063-1095, 2007. (multidimensional case, i.e., ds(j)>=1)
+%   Zoltan Szabo, Barnabas Poczos, Andras Lorincz: Undercomplete Blind Subspace Deconvolution. Journal of Machine Learning Research 8(May):1063-1095, 2007. (multidimensional case, i.e., ds(j)>=1)
 %   Francis Bach, Michael I. Jordan. Kernel Independent Component Analysis. Journal of Machine Learning Research, 3: 1-48, 2002. (one-dimensional case, i.e., ds(1)=ds(2)=...=ds(end)=1)
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")

code/H_I_D/base_estimators/ISW1_estimation.m

 %  ds: subspace dimensions.
 %  co: initialized mutual information estimator object.
 %REFERENCE:
-%  Sergey Kirshner and Barnab�s P�czos. ICA and ISA Using Schweizer-Wolff Measure of Dependence. International Conference on Machine Learning (ICML), 464-471, 2008.
+%  Sergey Kirshner and Barnabas Poczos. ICA and ISA Using Schweizer-Wolff Measure of Dependence. International Conference on Machine Learning (ICML), 464-471, 2008.
 %  B. Schweizer and E. F. Wolff. On Nonparametric Measures of Dependence for Random Variables. The Annals of Statistics 9:879-885, 1981.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")

code/H_I_D/base_estimators/ISWinf_estimation.m

 %  ds: subspace dimensions.
 %  co: initialized mutual information estimator object.
 %REFERENCE:
-%  Sergey Kirshner and Barnab�s P�czos. ICA and ISA Using Schweizer-Wolff Measure of Dependence. International Conference on Machine Learning (ICML), 464-471, 2008.
+%  Sergey Kirshner and Barnabas Poczos. ICA and ISA Using Schweizer-Wolff Measure of Dependence. International Conference on Machine Learning (ICML), 464-471, 2008.
 %  B. Schweizer and E. F. Wolff. On Nonparametric Measures of Dependence for Random Variables. The Annals of Statistics 9:879-885, 1981.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")

code/H_I_D/meta_estimators/DJdistance_initialization.m

     co.mult = mult;
     
 %other fields:
-    co.member_name = 'Renyi_kNN_k'; %you can change it to any Kullback-Leibler entropy estimator, the R�nyi divergence converges to the Shannon's as \alpha->1.
+    co.member_name = 'Renyi_kNN_k'; %you can change it to any Kullback-Leibler divergence estimator; the Renyi divergence converges to the Kullback-Leibler divergence as \alpha->1.
     co.member_co = D_initialization(co.member_name,mult);

code/H_I_D/meta_estimators/HRPensemble_estimation.m

 %   Y: Y(:,t) is the t^th sample.
 %  co: entropy estimator object.
 %REFERENCE: 
-%	Zolt�n Szab�, Andr�s L�rincz: Fast Parallel Estimation of High Dimensional Information Theoretical Quantities with Low Dimensional Random Projection Ensembles. ICA 2009, pages 146-153.
+%	Zoltan Szabo, Andras Lorincz: Fast Parallel Estimation of High Dimensional Information Theoretical Quantities with Low Dimensional Random Projection Ensembles. ICA 2009, pages 146-153.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/H_I_D/meta_estimators/IL2_DL2_estimation.m

 %  ds: subspace dimensions.
 %  co: mutual information estimator object.
 %REFERENCE:
-%   Barnab�s P�czos, Zolt�n Szab�, Jeff Schneider: Nonparametric divergence estimators for Independent Subspace Analysis. EUSIPCO-2011, pages 1849-1853.
+%   Barnabas Poczos, Zoltan Szabo, Jeff Schneider: Nonparametric divergence estimators for Independent Subspace Analysis. EUSIPCO-2011, pages 1849-1853.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/H_I_D/meta_estimators/IMMD_DMMD_estimation.m

 %  ds: subspace dimensions.
 %  co: initialized mutual information estimator object.
 %REFERENCE:
-%  Barnabs Pczos, Zoubin Ghahramani, Jeff Schneider. Copula-based Kernel Dependency Measures, ICML-2012.
+%  Barnabas Poczos, Zoubin Ghahramani, Jeff Schneider. Copula-based Kernel Dependency Measures, ICML-2012.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/H_I_D/meta_estimators/IRenyi_DRenyi_estimation.m

 function [I] = IRenyi_DRenyi_estimation(Y,ds,co)
-%Estimates Rnyi mutual information (I) making use of an(y) estimator for Rnyi divergence; co is the cost object.
+%Estimates Renyi mutual information (I) making use of an(y) estimator for Renyi divergence; co is the cost object.
%This is a "meta" method, using the relation: I(y^1,...,y^M) = D(f_y,\prod_{m=1}^M f_{y^m}).
 %
 %We make use of the naming convention 'I<name>_estimation', to ease embedding new mutual information estimation methods.
 %  ds: subspace dimensions.
 %  co: mutual information estimator object.
 %REFERENCE:
-%   Barnab�s P�czos, Zolt�n Szab�, Jeff Schneider: Nonparametric divergence estimators for Independent Subspace Analysis. EUSIPCO-2011, pages 1849-1853.
-%   Barnab�s P�czos, Jeff Schneider: On the Estimation of alpha-Divergences. AISTATS-2011, pages 609-617.
+%   Barnabas Poczos, Zoltan Szabo, Jeff Schneider: Nonparametric divergence estimators for Independent Subspace Analysis. EUSIPCO-2011, pages 1849-1853.
+%   Barnabas Poczos, Jeff Schneider: On the Estimation of alpha-Divergences. AISTATS-2011, pages 609-617.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/H_I_D/meta_estimators/IRenyi_DRenyi_initialization.m

 function [co] = IRenyi_DRenyi_initialization(mult)
-%Initialization of the "meta" R�nyi mutual information estimator, which is 
-%   1)based on an(y) estimator for R�nyi  divergence,
+%Initialization of the "meta" Renyi mutual information estimator, which is 
+%   1)based on an(y) estimator for Renyi divergence,
 %   2)is treated as a cost object (co). 
 %Mutual information is estimated using the relation: I(y^1,...,y^M) = D(f_y,\prod_{m=1}^M f_{y^m}).
 %
     co.alpha = 0.95;
 	
 %other fields:    
-    co.member_name = 'Renyi_kNN_k'; %you can change it to any Rnyi divergence estimator
+    co.member_name = 'Renyi_kNN_k'; %you can change it to any Renyi divergence estimator
     co.member_co = D_initialization(co.member_name,mult);
     co.member_co.alpha = co.alpha;
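
A minimal usage sketch of this "meta" construction, assuming two 2-dimensional subspaces:

    ds = [2;2];                            %subspace dimensions
    Y = randn(sum(ds),2000);               %Y(:,t) is the t^th sample
    co = IRenyi_DRenyi_initialization(1);  %member: the 'Renyi_kNN_k' Renyi divergence estimator
    I = IRenyi_DRenyi_estimation(Y,ds,co); %I(y^1,...,y^M) = D(f_y,\prod_{m=1}^M f_{y^m})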

code/H_I_D/meta_estimators/IRenyi_HRenyi_estimation.m

 function [I_alpha]  = IRenyi_HRenyi_estimation(Y,ds,co)
-%Estimates R�nyi mutual information using the formula: "I_{alpha}(Y) = -H_{alpha}(Z)", where Z =[F_1(Y_1);...;F_d(Y_d)] is the copula transformation of Y; F_i is the cdf of Y_i.
+%Estimates Renyi mutual information using the formula: "I_{alpha}(Y) = -H_{alpha}(Z)", where Z =[F_1(Y_1);...;F_d(Y_d)] is the copula transformation of Y; F_i is the cdf of Y_i.
 %This is a "meta" method, i.e., the H_{alpha} estimator can be arbitrary.
 %
 %We make use of the naming convention 'I<name>_estimation', to ease embedding new mutual information estimation methods.
 %  ds: subspace dimensions.
 %  co: mutual information estimator object.
 %REFERENCE:
-%   Dvid Pl, Barnabs Pczos, Csaba Szepesvri: Estimation of Rnyi Entropy and Mutual Information Based on
+%   David Pal, Barnabas Poczos, Csaba Szepesvari: Estimation of Renyi Entropy and Mutual Information Based on
%   Generalized Nearest-Neighbor Graphs. NIPS-2010, pages 1849-1857.
-%   Barnabs Pczos, Sergey Krishner, Csaba Szepesvri. REGO: Rank-based Estimation of Rnyi Information using Euclidean Graph
+%   Barnabas Poczos, Sergey Kirshner, Csaba Szepesvari. REGO: Rank-based Estimation of Renyi Information using Euclidean Graph
 %   Optimization. AISTATS-2010, pages 605-612.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")

code/H_I_D/meta_estimators/IRenyi_HRenyi_initialization.m

 function [co] = IRenyi_HRenyi_initialization(mult)
-%Initialization of the "meta" Rnyi mutual information estimator. The estimator uses the identity: 
+%Initialization of the "meta" Renyi mutual information estimator. The estimator uses the identity: 
 %I_{alpha}(X) = -H_{alpha}(Z), where Z =[F_1(X_1);...;F_d(X_d)] is the copula transformation of X; F_i is the cdf of X_i.
 %
 %Note:
     co.alpha = 0.99;
     
 %other fields:    
-    co.member_name = 'Renyi_kNN_k'; %you can change it to any Rnyi entropy estimator 
+    co.member_name = 'Renyi_kNN_k'; %you can change it to any Renyi entropy estimator 
     co.member_co = H_initialization(co.member_name,mult);
     co.member_co.alpha = co.alpha;
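
A rank-based sketch of the copula transformation Z = [F_1(X_1);...;F_d(X_d)] behind the identity above (empirical cdf via coordinate-wise ranks; a hypothetical helper, not the ITE implementation):

    function [Z] = copula_transformation_sketch(X)
    %X: X(:,t) is the t^th sample; Z: empirical copula transform of X.
        [d,T] = size(X);
        Z = zeros(d,T);
        for i = 1 : d
            [sorted,idx] = sort(X(i,:)); %idx(r): index of the sample ranked r^th in coordinate i
            Z(i,idx) = (1:T) / T;        %empirical cdf value = rank/T
        end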

code/H_I_D/meta_estimators/IShannon_HShannon_initialization.m

     
 %other fields:
     co.member_name = 'Shannon_kNN_k'; %you can change it to any Shannon entropy estimator capable of 'multiplicative 
-	%factor correct' H estimation (mult=1) below. Note: Rnyi entropy (H_{R,alpha}) also gives in limit (alpha->1) the Shannon entropy (H).
+	%factor correct' H estimation (mult=1) below. Note: the Renyi entropy (H_{R,alpha}) also gives the Shannon entropy (H) in the limit (alpha->1).
     co.member_co = H_initialization(co.member_name,1);%'1': since we use the relation '(*)' (multiplicative factors in entropy estimations are NOT allowed).
    
     

code/H_I_D/meta_estimators/ITsallis_DTsallis_estimation.m

 %We make use of the naming convention 'I<name>_estimation', to ease embedding new mutual information estimation methods.
 %
 %INPUT:
-%   Y: Y(:,t) is t^th sample.
+%   Y: Y(:,t) is the t^th sample.
 %  ds: subspace dimensions.
 %  co: mutual information estimator object.
 %REFERENCE:
-%   Barnab�s P�czos, Zolt�n Szab�, Jeff Schneider: Nonparametric divergence estimators for Independent Subspace Analysis. EUSIPCO-2011, pages 1849-1853.
-%   Barnab�s P�czos, Jeff Schneider: On the Estimation of alpha-Divergences. AISTATS-2011, pages 609-617.
+%   Barnabas Poczos, Zoltan Szabo, Jeff Schneider: Nonparametric divergence estimators for Independent Subspace Analysis. EUSIPCO-2011, pages 1849-1853.
+%   Barnabas Poczos, Jeff Schneider: On the Estimation of alpha-Divergences. AISTATS-2011, pages 609-617.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/H_I_D/utilities/ann_octave.m

 function [indices,squared_distances] = ann_octave(Y,Q,co,Y_equals_to_Q)
-%Computes the k nearest neighbor distances of each point (Q(:,t)) in
+%Computes the k-nearest neighbor distances of each point (Q(:,t)) in
%matrix Q from samples (Y(:,t)) in matrix Y. The number of samples in Q (size(Q,2)) and Y (size(Y,2)) can be different. Parameters in the cost object (co) can be used to fine-tune the kNN search (k,...).
 %Y_equals_to_Q is a flag indicating whether Y is equal to Q or not: true (=1), false (=0).
 %The computations are based on the ann Octave bindings.

code/H_I_D/utilities/estimate_Dalpha.m

 function [D_alpha] = estimate_Dalpha(X,Y,co)
-%Estimates D_alpha = \int p^{\alpha}(x)q^{1-\alpha}(x)dx, the Rnyi and the Tsallis divergences are simple functions of this quantity.
+%Estimates D_alpha = \int p^{\alpha}(x)q^{1-\alpha}(x)dx, the Renyi and the Tsallis divergences are simple functions of this quantity.
 %
 %INPUT:
 %   X: X(:,t) is the t^th sample from the first distribution.

code/H_I_D/utilities/estimate_Ialpha.m

 function [I_alpha] = estimate_Ialpha(Y,co)
-%Estimates I_alpha = \int p^{\alpha}(y)dy, the Rnyi and the Tsallis entropies are simple functions of this quantity. Here, alpha:=co.alpha.
+%Estimates I_alpha = \int p^{\alpha}(y)dy, the Renyi and the Tsallis entropies are simple functions of this quantity. Here, alpha:=co.alpha.
 %
 %INPUT:
 %   Y: Y(:,t) is the t^th sample from the distribution having density p.
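
Both quantities above determine the corresponding entropies and divergences in closed form; a short sketch using the standard identities (alpha = co.alpha, alpha ~= 1):

    I_alpha = estimate_Ialpha(Y,co);
    H_Renyi = log(I_alpha) / (1-co.alpha);    %Renyi entropy: H_{R,alpha} = log(I_alpha)/(1-alpha)
    H_Tsallis = (1-I_alpha) / (co.alpha-1);   %Tsallis entropy: H_{T,alpha} = (1-I_alpha)/(alpha-1)
    D_alpha = estimate_Dalpha(X,Y,co);
    D_Renyi = log(D_alpha) / (co.alpha-1);    %Renyi divergence: D_{R,alpha} = log(D_alpha)/(alpha-1)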

code/H_I_D/utilities/kNN_squared_distances.m

 function [squared_distances,indices] = kNN_squared_distances(Y,Q,co,Y_equals_to_Q)
-%Computes the k nearest neighbor distances of each point (Q(:,t)) in
+%Computes the k-nearest neighbor distances of each point (Q(:,t)) in
%matrix Q from samples (Y(:,t)) in matrix Y. The number of samples in Q (size(Q,2)) and Y (size(Y,2)) can be different. Parameters in the cost object (co) can be used to fine-tune the kNN search (k,...).
 %Y_equals_to_Q is a flag indicating whether Y is equal to Q or not: true (=1), false (=0).
 %
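
A minimal usage sketch with the 'knnFP1' method (exact kNN; parameter co.k, see the possibilities listed in the initialization files):

    Y = randn(3,2000); Q = randn(3,500); %samples and query points, stored columnwise
    co.kNNmethod = 'knnFP1'; co.k = 3;
    [squared_distances,indices] = kNN_squared_distances(Y,Q,co,0); %Y_equals_to_Q = 0 (false)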

code/IPA/data_generation/datasets/plot_subspaces_2D_trajectory.m

 %Plots the trajectory of signal s, composed of 2-dimensional components.  
 %
 %INPUT:
-%   s(:,t) :t^th sample point
+%   s(:,t): t^th sample point
 %OUTPUT:
 %   h: handle of the figure.
 %

code/IPA/data_generation/datasets/sample_subspaces_3D_geom.m

 %EXAMPLE:
 %   e = sample_subspaces_3D_geom(6,1000);
 %REFERENCE: 
-%   Barnab�s P�czos, Andr�s L�rincz. Independent Subspace Analysis Using k-nearest Neighborhood Distances. International Conference on Artificial Neural Networks (ICANN), pp. 163-168, 2005.
+%   Barnabas Poczos, Andras Lorincz. Independent Subspace Analysis Using k-nearest Neighborhood Distances. International Conference on Artificial Neural Networks (ICANN), pp. 163-168, 2005.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/data_generation/datasets/sample_subspaces_ABC.m

 %EXAMPLE:
 %   e = sample_subspaces_ABC(26,1000);
 %REFERENCE:
-%   Barnab�s P�czos, Andr�s L�rincz. Independent Subspace Analysis Using Geodesic Spanning Trees. International Conference on Machine Learning   (ICML), pp. 673-680, 2005.
+%   Barnabas Poczos, Andras Lorincz. Independent Subspace Analysis Using Geodesic Spanning Trees. International Conference on Machine Learning (ICML), pp. 673-680, 2005.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/data_generation/datasets/sample_subspaces_Aw.m

 %EXAMPLE:
 %   e = sample_subspaces_Aw(50,1000);
 %REFERENCE:
-%   Zolt�n Szab� and Andr�s L�rincz. Real and Complex Independent Subspace Analysis by Generalized Variance. ICA Research Network International Workshop (ICARN), pp. 85-88, 2006.
+%   Zoltan Szabo and Andras Lorincz. Real and Complex Independent Subspace Analysis by Generalized Variance. ICA Research Network International Workshop (ICARN), pp. 85-88, 2006.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/data_generation/datasets/sample_subspaces_GreekABC.m

 %EXAMPLE:
 %   e = sample_subspaces_GreekABC(24,1000);
 %REFERENCE:
-%   Zolt�n Szab� and Andr�s L�rincz. Real and Complex Independent Subspace Analysis by Generalized Variance. ICA Research Network International Workshop (ICARN), pp. 85-88, 2006. [Extension to the Greek alphabet = A\omega dataset]
+%   Zoltan Szabo and Andras Lorincz. Real and Complex Independent Subspace Analysis by Generalized Variance. ICA Research Network International Workshop (ICARN), pp. 85-88, 2006. [Extension to the Greek alphabet = A\omega dataset]
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/data_generation/datasets/sample_subspaces_IFS.m

 %EXAMPLE:
 %   e = sample_subspaces_IFS(9,10000);
 %REFERENCE:
-%	Zolt�n Szab�, Barnab�s P�czos, G�bor Szirtes, and Andr�s L�rincz. Post Nonlinear Independent Subspace Analysis. International Conference on Artificial Neural Networks (ICANN), pp. 677-686, 2007.
+%	Zoltan Szabo, Barnabas Poczos, Gabor Szirtes, and Andras Lorincz. Post Nonlinear Independent Subspace Analysis. International Conference on Artificial Neural Networks (ICANN), pp. 677-686, 2007.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/data_generation/datasets/sample_subspaces_all_k_independent.m

 %EXAMPLE:
 %   e = sample_subspaces_all_k_independent(num_of_comps,num_of_samples,k);
 %REFERENCE:
-%   Zolt�n Szab�, Barnab�s P�czos, and Andr�s L�rincz. Cross-Entropy  Optimization for Independent Process Analysis. International Conference   on Independent Component Analysis and Blind Source Separation (ICA 2006), pp. 909-916, 2006. (general k)
-%   Barnab�s P�czos, Andr�s L�rincz. Independent Subspace Analysis Using Geodesic Spanning Trees. International Conference on Machine Learning (ICML), pp. 673-680, 2005. (k=2)
+%   Zoltan Szabo, Barnabas Poczos, and Andras Lorincz. Cross-Entropy Optimization for Independent Process Analysis. International Conference on Independent Component Analysis and Blind Source Separation (ICA 2006), pp. 909-916, 2006. (general k)
+%   Barnabas Poczos, Andras Lorincz. Independent Subspace Analysis Using Geodesic Spanning Trees. International Conference on Machine Learning (ICML), pp. 673-680, 2005. (k=2)
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/data_generation/datasets/sample_subspaces_ikeda.m

 %EXAMPLE:
 %   e = sample_subspaces_ikeda(2,1000);
 %REFERENCE:
-%   Zoltn Szab, Barnabs Pczos. Nonparametric Independent Process Analysis. European Signal Processing Conference (EUSIPCO), pp. 1718-1722, 2011.
+%   Zoltan Szabo, Barnabas Poczos. Nonparametric Independent Process Analysis. European Signal Processing Conference (EUSIPCO), pp. 1718-1722, 2011.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/data_generation/datasets/sample_subspaces_lorenz.m

 %EXAMPLE:
 %   e = sample_subspaces_lorenz(3,1000);
 %REFERENCE: 
-%   Zolt�n Szab�, Barnab�s P�czos, and Andr�s L�rincz. Auto-Regressive Independent Process Analysis without Combinatorial Efforts. Pattern Analysis and Applications, 13:1-13, 2010. 
+%   Zoltan Szabo, Barnabas Poczos, and Andras Lorincz. Auto-Regressive Independent Process Analysis without Combinatorial Efforts. Pattern Analysis and Applications, 13:1-13, 2010. 
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/data_generation/datasets/sample_subspaces_multiD_geom.m

 %   e = sample_multiD_geom(1000,[2;2;2]);%1000 samples from the 'multi2-2-2-geom' = 'multi2-geom' dataset.
 %   e = sample_multiD_geom(1000,[2;3;4]);%1000 samples from the 'multi2-3-4-geom' dataset.
 %REFERENCE:
-%   Zolt�n Szab� and Andr�s L�rincz. Real and Complex Independent Subspace Analysis by Generalized Variance. ICA Research Network International Workshop (ICARN), pp. 85-88, 2006.
+%   Zoltan Szabo and Andras Lorincz. Real and Complex Independent Subspace Analysis by Generalized Variance. ICA Research Network International Workshop (ICARN), pp. 85-88, 2006.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/data_generation/datasets/sample_subspaces_multiD_spherical.m

 %   e = sample_multiD_spherical(1000,[2;2;2]);%1000 samples from the 'multi2-2-2-spherical' = 'multi2-spherical' dataset.
 %   e = sample_multiD_spherical(1000,[2;3;4]);%1000 samples from the 'multi2-3-4-spherical' dataset.
 %REFERENCE:
-%   Zolt�n Szab� and Andr�s L�rincz. Real and Complex Independent Subspace Analysis by Generalized Variance. ICA Research Network International Workshop (ICARN), pp. 85-88, 2006.
+%   Zoltan Szabo and Andras Lorincz. Real and Complex Independent Subspace Analysis by Generalized Variance. ICA Research Network International Workshop (ICARN), pp. 85-88, 2006.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/demos/demo_ARX_IPA.m

             ICA_method = 'fastICA'; %see 'estimate_ICA.m'
             %ISA cost (during the clustering of the ICA elements):
                 cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-                cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are R�nyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+                cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
                 opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
                 %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
         %ARX:                

code/IPA/demos/demo_AR_IPA.m

             ICA_method = 'fastICA'; %see 'estimate_ICA.m'
             %ISA cost (during the clustering of the ICA elements):
                 cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-                cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are R�nyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+                cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
                 opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
                 %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
         

code/IPA/demos/demo_ISA.m

         ICA_method = 'fastICA'; %see 'estimate_ICA.m'
         %ISA cost (during the clustering of the ICA elements):
             cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-            cost_name = 'Renyi_kNN_1tok'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are R�nyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+            cost_name = 'Renyi_kNN_1tok'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
             opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
             %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
         

code/IPA/demos/demo_MA_IPA_LPA.m

         ICA_method = 'fastICA'; %see 'estimate_ICA.m'
         %ISA cost (during the clustering of the ICA elements):
             cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-            cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are R�nyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+            cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
             opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
             %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
     

code/IPA/demos/demo_PNL_ISA.m

         ICA_method = 'fastICA'; %see 'estimate_ICA.m'
         %ISA cost (during the clustering of the ICA elements):
             cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-            cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are R�nyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+            cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
             opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
             %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
     %gaussianization:

code/IPA/demos/demo_complex_ISA.m

         ICA_method = 'fastICA'; %see 'estimate_ICA.m'
         %ISA cost (during the clustering of the ICA elements):
             cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-            cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are R�nyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+            cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
             opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
             %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
         

code/IPA/demos/demo_fAR_IPA.m

         ICA_method = 'fastICA'; %see 'estimate_ICA.m'
         %ISA cost (during the clustering of the ICA elements):
             cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-            cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are R�nyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+            cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
             opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
             %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
     %fAR:

code/IPA/demos/demo_mAR_IPA.m

             ICA_method = 'fastICA'; %see 'estimate_ICA.m'
             %ISA cost (during the clustering of the ICA elements):
                 cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-                cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are R�nyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+                cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
                 opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
                 %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
         

code/IPA/demos/demo_uMA_IPA_LPA.m

         ICA_method = 'fastICA'; %see 'estimate_ICA.m'
         %ISA cost (during the clustering of the ICA elements):
             cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-            cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are R�nyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+            cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
             opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
             %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
      %ARfit:

code/IPA/demos/demo_uMA_IPA_TCC.m

         ICA_method = 'fastICA'; %see 'estimate_ICA.m'
         %ISA cost (during the clustering of the ICA elements):
             cost_type = 'Ipairwise1d'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-            cost_name = 'GV'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are R�nyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+            cost_name = 'GV'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
             opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
             %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
     

code/IPA/demos/estimate_ARX.m

 %   Bx_hat: estimated matrix describing the effect of the control on observation x.
 %   obs: observations, obs(:,t): observation at time t.
 %REFERENCE:
-%   Barnab�s P�czos, Andr�s L�rincz. Identification of Recurrent Neural Networks by Bayesian Interrogation Techniques. Journal of Machine Learning Research 10 (2009) 515-554. [NIW method]
+%   Barnabas Poczos, Andras Lorincz. Identification of Recurrent Neural Networks by Bayesian Interrogation Techniques. Journal of Machine Learning Research 10 (2009) 515-554. [NIW method]
 %NOTE: 
 %   Fx,Bx, and Ae are used for observation generation only.
 %

code/IPA/demos/estimate_ARX_IPA.m

 %   Du: dimension of the control.
 %   u_size: size of the control; box constraint, i.e., |u_i| <= u_size
 %   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization  type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated R�nyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means 'the subspace dimensions are known'; '1' means 'the number of the subspaces are known' (but the individual dimensions are unknown).
 %   de: 
 %       1)in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %   Bx_hat: estimated matrix describing the effect of the control on observation x.
 %   s_hat: estimated source.
 %REFERENCE:
-%   Zolt�n Szab� and Andr�s L�rincz. Towards Independent Subspace Analysis in Controlled Dynamical Systems. ICA Research Network International Workshop (ICARN), pages 9-12, 2008.
+%   Zoltan Szabo and Andras Lorincz. Towards Independent Subspace Analysis in Controlled Dynamical Systems. ICA Research Network International Workshop (ICARN), pages 9-12, 2008.
 %NOTE: 
 %   Fx,Bx, and Ae are used for observation generation only.
 %

code/IPA/demos/estimate_AR_IPA.m

 %      ARmethod_parameters.L: AR order; can be vector, too; in that case the 'best' AR order is chosen according to SBC, see 'estimate_AR.m'.
 %      ARmethod_parameters.method: AR estimation method. Possibilities: 'NIW', 'subspace', 'subspace-LL', 'LL'.
 %   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization  type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated R�nyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means 'the subspace dimensions are known'; '1' means 'the number of the subspaces are known' (but the individual dimensions are unknown).
 %   de: 
 %       1)in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %   Fx_hat: estimated observation dynamics.
 %   Fs_hat: estimated source dynamics.
 %REFERENCE:
-%  Barnab�s P�czos, B�lint Tak�cs, and Andr�s L�rincz. Independent Subspace Analysis on Innovations. European Conference on Machine Learning (ECML), pp. 698-706, 2005. (multidimensional sources)
+%  Barnabas Poczos, Balint Takacs, and Andras Lorincz. Independent Subspace Analysis on Innovations. European Conference on Machine Learning (ECML), pp. 698-706, 2005. (multidimensional sources)
%  Aapo Hyvarinen. Independent component analysis for time-dependent stochastic processes. International Conference on Artificial Neural Networks (ICANN), pp. 541-546, 1998. (one-dimensional sources)
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")

code/IPA/demos/estimate_ISA.m

 %INPUT:
 %   x: x(:,t) is the observation at time t.
 %   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization  type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated R�nyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means 'the subspace dimensions are known'; '1' means 'the number of the subspaces are known' (but the individual dimensions are unknown).
 %   de: 
 %       1)in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %   W_hat: estimated ISA demixing matrix.
 %   de_hat: in case of known subspace dimensions ('unknown_dimensions = 0') de_hat = de; else it contains the estimated subspace dimensions; ordered increasingly.
 %REFERENCE:
-%   Zolt�n Szab�, Barnab�s P�czos, Andr�s L�rincz: Undercomplete Blind Subspace Deconvolution. Journal of Machine Learning Research 8(May):1063-1095, 2007. (proof; sufficient conditions for the ISA separation theorem)
+%   Zoltan Szabo, Barnabas Poczos, Andras Lorincz: Undercomplete Blind Subspace Deconvolution. Journal of Machine Learning Research 8(May):1063-1095, 2007. (proof; sufficient conditions for the ISA separation theorem)
%   Jean-Francois Cardoso. Multidimensional independent component analysis. International Conference on Acoustics, Speech, and Signal Processing (ICASSP), pages 1941-1944, 1998. (conjecture)
%   Lieven De Lathauwer, Bart De Moor, and Joos Vandewalle. Fetal electrocardiogram extraction by source subspace separation. In IEEE SP/Athos Workshop on Higher-Order Statistics, pages 134-138, 1995. ('conjecture')
 %
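
A hedged call sketch assembled from the INPUT/OUTPUT lists above (the argument order is an assumption read off from the INPUT list; see 'demo_ISA.m' for the authoritative usage):

    ICA_method = 'fastICA'; cost_type = 'sumH'; cost_name = 'Renyi_kNN_1tok'; opt_type = 'greedy';
    unknown_dimensions = 0; de = [2;2]; %'0': the subspace dimensions are known
    [W_hat,de_hat] = estimate_ISA(x,ICA_method,cost_type,cost_name,opt_type,unknown_dimensions,de);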

code/IPA/demos/estimate_MA_IPA_LPA.m

 %      ARmethod_parameters.L: AR order; can be vector, too; in that case the 'best' AR order is chosen according to SBC, see 'estimate_AR.m'.
 %      ARmethod_parameters.method: AR estimation method. Possibilities: 'NIW', 'subspace', 'subspace-LL', 'LL'.
 %   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Rényi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means that the subspace dimensions are known; '1' means that the number of subspaces is known (but the individual dimensions are unknown).
 %   de: 
 %       1) in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %   e_hat: e_hat(:,t) is the estimated source at time t.
 %   de_hat: in case of known subspace dimensions ('unknown_dimensions = 0') de_hat = de; else it contains the estimated subspace dimensions; ordered increasingly.
 %REFERENCE:
-%  	Zoltán Szabó. Complete Blind Subspace Deconvolution. International Conference on Independent Component Analysis and Signal Separation (ICA), pages 138-145, 2009.
+%  	Zoltan Szabo. Complete Blind Subspace Deconvolution. International Conference on Independent Component Analysis and Signal Separation (ICA), pages 138-145, 2009.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/demos/estimate_PNL_ISA.m

 %Estimates the PNL-ISA model. Method: gaussianization, followed by ISA on the gaussianized result.
 %
 %INPUT:
-%   x: x(:,t) observation at time t.
+%   x: x(:,t) is the observation at time t.
 %   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Rényi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means that the subspace dimensions are known; '1' means that the number of subspaces is known (but the individual dimensions are unknown).
 %   de: 
 %       1) in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %   W_hat: estimated ISA demixing matrix.
 %   de_hat: in case of known subspace dimensions ('unknown_dimensions = 0') de_hat = de; else it contains the estimated subspace dimensions; ordered increasingly.
 %REFERENCE:
-%   Zoltán Szabó, Barnabás Póczos, Gábor Szirtes, and András Lőrincz. Post Nonlinear Independent Subspace Analysis. International Conference on Artificial Neural Networks (ICANN), pages 677-686, 2007.
+%   Zoltan Szabo, Barnabas Poczos, Gabor Szirtes, and Andras Lorincz. Post Nonlinear Independent Subspace Analysis. International Conference on Artificial Neural Networks (ICANN), pages 677-686, 2007.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/demos/estimate_complex_ISA.m

 %INPUT:
 %   x: x(:,t) is the observation at time t.
 %   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Rényi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means that the subspace dimensions are known; '1' means that the number of subspaces is known (but the individual dimensions are unknown).
 %   de: 
 %       1) in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %   de_real_hat: in case of known subspace dimensions ('unknown_dimensions = 0') de_real_hat = 2*de; else it contains the estimated (real) subspace dimensions; ordered increasingly.
 %   e_hat: e_hat(:,t) is the estimated (complex) ISA source at time t.
 %REFERENCE:
-%   Zoltán Szabó and András Lőrincz. Complex Independent Process Analysis. Acta Cybernetica 19:177-190, 2009.
+%   Zoltan Szabo and Andras Lorincz. Complex Independent Process Analysis. Acta Cybernetica 19:177-190, 2009.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/demos/estimate_complex_ISA_C.m

 %INPUT:
 %   x: x(:,t) is the observation at time t.
 %   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Rényi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means that the subspace dimensions are known; '1' means that the number of subspaces is known (but the individual dimensions are unknown).
 %   de: 
 %       1) in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %   W_hat: estimated ISA demixing matrix.
 %   de_hat: in case of known subspace dimensions ('unknown_dimensions = 0') de_hat = de; else it contains the estimated subspace dimensions; ordered increasingly.
 %REFERENCE:
-%   Zoltán Szabó, Barnabás Póczos, and András Lőrincz. Undercomplete Blind Subspace Deconvolution. Journal of Machine Learning Research 8(May):1063-1095, 2007.
+%   Zoltan Szabo, Barnabas Poczos, and Andras Lorincz. Undercomplete Blind Subspace Deconvolution. Journal of Machine Learning Research 8(May):1063-1095, 2007.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/demos/estimate_fAR_IPA.m

 %   x: x(:,t) is the t^th observation from the fAR model.
 %   L: fAR order.
 %   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Rényi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means that the subspace dimensions are known; '1' means that the number of subspaces is known (but the individual dimensions are unknown).
 %   de: 
 %       1) in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %   de_hat: in case of known subspace dimensions ('unknown_dimensions = 0') de_hat = de; else it contains the estimated subspace dimensions; ordered increasingly.
 %   s_hat: s_hat(:,t) is the estimated source at time t.
 %REFERENCE:
-%  Zoltán Szabó and Barnabás Póczos. Nonparametric Independent Process Analysis. European Signal Processing Conference (EUSIPCO), pages 1718-1722, 2011.
+%  Zoltan Szabo and Barnabas Poczos. Nonparametric Independent Process Analysis. European Signal Processing Conference (EUSIPCO), pages 1718-1722, 2011.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/demos/estimate_gaussianization.m

 %OUTPUT:
 %   s_transformed(:,t): variable at time t, transformed variant of s.
 %REFERENCE:
-%   'rank': Andreas Ziehe, Motoaki Kawanabe, Stefan Harmeling, Klaus-Robert Müller. Blind separation of postnonlinear mixtures using linearizing transformations and temporal decorrelation. Journal of Machine Learning Research 4 (2003) 1319-1338.
+%   'rank': Andreas Ziehe, Motoaki Kawanabe, Stefan Harmeling, Klaus-Robert Muller. Blind separation of postnonlinear mixtures using linearizing transformations and temporal decorrelation. Journal of Machine Learning Research 4 (2003) 1319-1338.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %
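The 'rank' variant cited above has a compact textbook form: replace each coordinate by its empirical rank and push the result through the inverse Gaussian CDF. A minimal sketch of that idea, assuming the Statistics Toolbox functions 'tiedrank' and 'norminv' are available; it illustrates the principle and is not claimed to match the ITE implementation line by line.

[D,T] = size(s);                           %s(:,t) is the sample at time t
s_transformed = zeros(D,T);
for d = 1:D
    r = tiedrank(s(d,:));                  %empirical ranks in 1..T, ties averaged
    s_transformed(d,:) = norminv(r/(T+1)); %approximately standard normal coordinates
end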

code/IPA/demos/estimate_mAR_IPA.m

 %   ARmethod_parameters:
 %      ARmethod_parameters.method: AR estimation method. Possibilities: 'NIW', 'subspace', 'subspace-LL', 'LL'.
 %   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Rényi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means that the subspace dimensions are known; '1' means that the number of subspaces is known (but the individual dimensions are unknown).
 %   de: 
 %       1) in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %   Fx_hat: estimated observation dynamics.
 %   Fs_hat: estimated source dynamics.
 %REFERENCE:
-%  Zoltán Szabó. Autoregressive Independent Process Analysis with Missing Observations. European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), pp. 159-164, 2010.
+%  Zoltan Szabo. Autoregressive Independent Process Analysis with Missing Observations. European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), pp. 159-164, 2010.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/demos/estimate_uMA_IPA_LPA.m

 %      ARmethod_parameters.L: AR order; can be a vector, too; in that case the 'best' AR order is chosen according to SBC, see 'estimate_AR.m'.
 %      ARmethod_parameters.method: AR estimation method. Possibilities: 'NIW', 'subspace', 'subspace-LL', 'LL'.
 %   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Rényi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means that the subspace dimensions are known; '1' means that the number of subspaces is known (but the individual dimensions are unknown).
 %   de: 
 %       1) in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %   W_hat: estimated demixing matrix.
 %   de_hat: in case of known subspace dimensions ('unknown_dimensions = 0') de_hat = de; else it contains the estimated subspace dimensions; ordered increasingly.
 %REFERENCE:
-% 	Zoltán Szabó, Barnabás Póczos, and András Lőrincz. Undercomplete Blind Subspace Deconvolution via Linear Prediction. European Conference on Machine Learning (ECML), pages 740-747, 2007.
+% 	Zoltan Szabo, Barnabas Poczos, and Andras Lorincz. Undercomplete Blind Subspace Deconvolution via Linear Prediction. European Conference on Machine Learning (ECML), pages 740-747, 2007.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/demos/estimate_uMA_IPA_TCC.m

 %   x: x(:,t) is the observation at time t.
 %   L: length of the convolution; H_0,...,H_{L}: L+1 H_j matrices.
 %   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Rényi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   cost_type, cost_name, opt_type: cost type, cost name, optimization type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means that the subspace dimensions are known; '1' means that the number of subspaces is known (but the individual dimensions are unknown).
 %   de: 
 %       1) in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %   de_hat: in case of known subspace dimensions ('unknown_dimensions = 0') de_hat = de; else it contains the estimated subspace dimensions; ordered increasingly.
 %   L2: determines (i) the number of times the subspaces are recovered (L+L2), (ii) the dimension of the associated ISA task (De x (L+L2)).
 %REFERENCE:
-%	Zoltán Szabó, Barnabás Póczos, and András Lőrincz. Undercomplete Blind Subspace Deconvolution. Journal of Machine Learning Research 8(May):1063-1095, 2007.
+%	Zoltan Szabo, Barnabas Poczos, and Andras Lorincz. Undercomplete Blind Subspace Deconvolution. Journal of Machine Learning Research 8(May):1063-1095, 2007.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/IPA/optimization/clustering_UD0_CE_general.m

 %OUTPUT:
 %   perm_ICA: permutation of the ICA elements.
 %REFERENCE:
-%   Zoltán Szabó, Barnabás Póczos, András Lőrincz: Cross-Entropy Optimization for Independent Process Analysis. ICA 2006, pages 909-916. (adaptation to the ISA problem)
+%   Zoltan Szabo, Barnabas Poczos, Andras Lorincz: Cross-Entropy Optimization for Independent Process Analysis. ICA 2006, pages 909-916. (adaptation to the ISA problem)
 %   Reuven Y. Rubinstein, Dirk P. Kroese. The Cross-Entropy Method. Springer, 2004. (TSP problem; TSP=travelling salesman problem)
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")

code/IPA/optimization/clustering_UD0_CE_pairadditive_wrt_coordinates.m

 %   perm_ICA: permutation of the ICA elements.
 %REFERENCE:
 %   This function is a simple adaptation of the works:
-%       Zoltán Szabó, Barnabás Póczos, András Lőrincz: Cross-Entropy Optimization for Independent Process Analysis. ICA 2006, pages 909-916. (adaptation to the ISA problem)
+%       Zoltan Szabo, Barnabas Poczos, Andras Lorincz: Cross-Entropy Optimization for Independent Process Analysis. ICA 2006, pages 909-916. (adaptation to the ISA problem)
 %       Reuven Y. Rubinstein, Dirk P. Kroese. The Cross-Entropy Method. Springer, 2004. (TSP problem; TSP=travelling salesman problem)
 %   to the pairwise similarity (S) based ISA formulation (see cost_type = 'Ipairwise1d').
 %

code/IPA/optimization/clustering_UD1.m

 %   perm_ICA: permutation of the ICA elements.
 %   ds_hat: estimated subspace dimensions; length(ds_hat) = num_of_comps.
 %REFERENCE:
-%   Barnabás Póczos, Zoltán Szabó, Melinda Kiszlinger, András Lőrincz: Independent Process Analysis without A Priori Dimensional Information. ICA-2007, pages 252-259. (application of the NCut method in IPA)
+%   Barnabas Poczos, Zoltan Szabo, Melinda Kiszlinger, Andras Lorincz: Independent Process Analysis without A Priori Dimensional Information. ICA-2007, pages 252-259. (application of the NCut method in IPA)
 %   Jianbo Shi and Jitendra Malik. Normalized Cuts and Image Segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 8, 2000. (NCut method)
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")

code/shared/Amari_index_ISA.m

 %   amari_ind = Amari_index_ISA([4 -2 2; 2 1 1; 2 1 1],[1,2],'uniform',1); %result:1.
 %   amari_ind = Amari_index_ISA([5 -3 4; 3 5/2 5/2; 4 5/2 5/2],[1,2],'uniform',2); %result:1.
 %REFERENCE:
-%  	Zoltán Szabó, Barnabás Póczos: Nonparametric Independent Process Analysis. EUSIPCO-2011, pages 1718-1722. (general d_m-dimensional subspaces, ISA).
+%  	Zoltan Szabo, Barnabas Poczos: Nonparametric Independent Process Analysis. EUSIPCO-2011, pages 1718-1722. (general d_m-dimensional subspaces, ISA).
 %   Shun-ichi Amari, Andrzej Cichocki, and Howard H. Yang. A new learning algorithm for blind signal separation. NIPS '96, pages 757-763. (one-dimensional case, i.e., ICA).
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")

code/shared/C2R_vector.m

 %INPUT:
 %  M_complex: DxT sized complex matrix.
 %OUTPUT:
-%  M_real: (2D)xT size real matrix.
+%  M_real: (2D)xT sized real matrix.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/shared/R2C_vector.m

 %This function is the inverse of 'C2R_vector.m'.
 %
 %INPUT:
-%  M_real: (2D)xT size real matrix.
+%  M_real: (2D)xT sized real matrix.
 %OUTPUT:
 %  M_complex: DxT sized complex matrix.
 %
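One natural realization of this DxT <-> (2D)xT correspondence is to stack real and imaginary parts; whether ITE stacks or interleaves the coordinates should be verified in 'C2R_vector.m' itself, so the convention below is an assumption.

M_real = [real(M_complex); imag(M_complex)];            %C2R direction: DxT complex -> (2D)xT real
D = size(M_real,1)/2;
M_complex_back = M_real(1:D,:) + 1i*M_real(D+1:end,:);  %R2C direction: recovers the DxT complex matrix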

code/shared/complex_whiten.m

 %Transforms variable e to have zero expectation and identity covariance.
 %
 %INPUT:
-%   e: e(:,t) sample at time t.
+%   e: e(:,t) is the sample at time t.
 %OUTPUT:
-%   e_whitened: whitened version of e. e_whitened(:,t) sample at time t.
+%   e_whitened: whitened version of e. e_whitened(:,t) is the sample at time t.
 %   C_whiten: whitening transformation.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
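Whitening of this kind is standard: center the samples, then multiply by an inverse matrix square root of the sample covariance (computed with the conjugate transpose, so the same recipe covers the complex case). A minimal sketch of that textbook recipe, not necessarily identical to the code in 'complex_whiten.m':

T = size(e,2);                        %e(:,t) is the sample at time t
e_centered = e - mean(e,2)*ones(1,T); %zero expectation
C = (e_centered*e_centered')/T;       %sample covariance; ' is the conjugate transpose
C_whiten = inv(sqrtm(C));             %whitening transformation
e_whitened = C_whiten*e_centered;     %identity covariance (up to sampling error)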

code/shared/estimate_AR_E4.m

 %   x_innovation_hat: estimated innovation of x; at time t it is x_innovation_hat(:,t).
 %   Fx_hat: estimated dynamics of x.
 %REFERENCE:
-%   Miguel Jerez, Jose Casals, Sonia Sotoca. Signal Extraction for Linear State-Space Models. Lambert Academic Publishing GmbH & Co. KG, Saarbrücken (Germany), 2011.
+%   Miguel Jerez, Jose Casals, Sonia Sotoca. Signal Extraction for Linear State-Space Models. Lambert Academic Publishing GmbH & Co. KG, Saarbrucken (Germany), 2011.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/shared/estimate_AR_NIW.m

 %   x_innovation_hat: estimated innovation of x; at time t it is x_innovation_hat(:,t).
 %   Fx_hat: estimated dynamics of x.
 %REFERENCES:
-%   Special case of 'Barnabás Póczos, András Lőrincz. Identification of Recurrent Neural Networks by Bayesian Interrogation Techniques. Journal of Machine Learning Research 10 (2009) 515-554.'
+%   Special case of 'Barnabas Poczos, Andras Lorincz. Identification of Recurrent Neural Networks by Bayesian Interrogation Techniques. Journal of Machine Learning Research 10 (2009) 515-554.'
 %   K. Rao Kadiyala and Sune Karlsson. Numerical methods for estimation and inference in Bayesian-VAR models. Journal of Applied Econometrics, 12:99-132, 1997.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")

code/shared/estimate_mAR_E4.m

 %   x_innovation_hat: estimated innovation of x; at time t it is x_innovation_hat(:,t).
 %   Fx_hat: estimated dynamics of x.
 %REFERENCE:
-%   Miguel Jerez, Jose Casals, Sonia Sotoca. Signal Extraction for Linear State-Space Models. Lambert Academic Publishing GmbH & Co. KG, Saarbrücken (Germany), 2011.
+%   Miguel Jerez, Jose Casals, Sonia Sotoca. Signal Extraction for Linear State-Space Models. Lambert Academic Publishing GmbH & Co. KG, Saarbrucken (Germany), 2011.
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %

code/shared/estimate_mAR_NIW.m

 %   x_innovation_hat: estimated innovation of x; at time t it is x_innovation_hat(:,t).
 %   Fx_hat: estimated dynamics of x.
 %REFERENCES:
-%   Special case (=no control) of 'Barnabás Póczos, András Lőrincz. Identification of Recurrent Neural Networks by Bayesian Interrogation Techniques. Journal of Machine Learning Research 10 (2009) 515-554.'
+%   Special case (=no control) of 'Barnabas Poczos, Andras Lorincz. Identification of Recurrent Neural Networks by Bayesian Interrogation Techniques. Journal of Machine Learning Research 10 (2009) 515-554.'
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %