Zoltan Szabo avatar Zoltan Szabo committed 2e8afa0

ICA and ISA structures: introduced for unified treatment of the estimators. It will also enable embedding of general ICA optimization algorithms such as the Jacobi method. Some variables: renamed; see ARmethod_parameters -> AR, ARXmethod_parameters -> ARX, fARmethod_parameters -> fAR, MAparameters -> MA, gaussianizationmethod_parameters -> gaussianization. 'estimate_mAR.m': 'stepwiseLS' method deleted. Example 20 (ISA-3): cost_type = 'KGV' typo corrected (-> cost_type = 'Ipairwise1d'; doc). 'kdpee.c': MSVC does not provide log2. A more elegant solution: added.


Files changed (35)

+v0.32 (Feb 25, 2013):
+-ICA and ISA structures: introduced for unified treatment of the estimators. It will also enable embedding of general ICA optimization algorithms such as the Jacobi method. 
+-Some variables: renamed; see ARmethod_parameters -> AR, ARXmethod_parameters -> ARX, fARmethod_parameters -> fAR, MAparameters -> MA, gaussianizationmethod_parameters -> gaussianization.
+-'estimate_mAR.m': 'stepwiseLS' method deleted.
+-Example 20 (ISA-3): cost_type = 'KGV' typo corrected (-> cost_type = 'Ipairwise1d'; doc).
+-'kdpee.c': MSVC does not provide log2. A more elegant solution: added.
+
 v0.31 (Feb 9, 2013):
 -EASI (equivariant adaptive separation via independence) real/complex ICA method: added; see 'estimate_ICA_EASI.m', 'estimate_ICA.m', 'estimate_complex_ICA.m'.
 -Adaptive (k-d) partitioning based Shannon entropy estimation: added; see 'HShannon_KDP_initialization.m', 'HShannon_KDP_estimation.m'.
 
 **Download** the latest release: 
 
-- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE-0.31_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE-0.31_code.tar.bz2), 
-- [documentation (pdf)](https://bitbucket.org/szzoli/ite/downloads/ITE-0.31_documentation.pdf).
+- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE-0.32_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE-0.32_code.tar.bz2), 
+- [documentation (pdf)](https://bitbucket.org/szzoli/ite/downloads/ITE-0.32_documentation.pdf).
 
 

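The struct-based refactoring above can be sketched as follows (a minimal illustration assembled from the v0.32 demos in this commit; field names are taken from those demos, and the exact argument list varies per estimator):

```matlab
%ICA solver (see 'estimate_ICA.m'):
    ICA.opt_type = 'fastICA'; %or 'EASI'
%ISA solver (clustering of the ICA elements):
    ISA.cost_type = 'sumH';            %'I','sumH','sum-I','Irecursive','Ipairwise','Ipairwise1d'
    ISA.cost_name = 'Renyi_kNN_1tok';  %entropy estimator used by the 'sumH' formulation
    ISA.opt_type  = 'greedy';          %'greedy','CE','exhaustive','NCut','SP1','SP2','SP3'
%v0.31 calling convention (separate variables):
%    [e_hat,W_hat,de_hat] = estimate_ISA(x,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,size(x,1));
%v0.32 calling convention (ICA and ISA structures):
    [e_hat,W_hat,de_hat] = estimate_ISA(x,ICA,ISA,unknown_dimensions,de,size(x,1));
```

Collecting the solver options into `ICA` and `ISA` structures keeps the estimator signatures stable when new options (e.g., a Jacobi-type ICA optimizer) are added later.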
code/IPA/data_generation/models/generate_MA_IPA.m

-function [x,H,e,de,num_of_comps] = generate_MA_IPA(data_type,num_of_comps,num_of_samples,undercompleteness,MAparameters)
+function [x,H,e,de,num_of_comps] = generate_MA_IPA(data_type,num_of_comps,num_of_samples,undercompleteness,MA)
 %Generates an uMA-IPA/complete MA-IPA model.
 %
 %INPUT:
 %   num_of_comps: number of subspaces, see 'sample_subspaces.m'.
 %   num_of_samples: number of samples.
 %   undercompleteness: dim(observation) = ceil((undercompleteness+1) x dim(source)), >=0.
-%   MAparameters: parameters of the convolution (MAparameters.L: length of the convolution; H_0,...,H_{L}: L+1 H_j matrices), see 'MA_polynomial.m'.
+%   MA: parameters of the convolution (MA.L: length of the convolution; H_0,...,H_{L}: L+1 H_j matrices), see 'MA_polynomial.m'.
 %OUTPUT:
 %   x: x(:,t) is the observation at time t; size(x,2) = num_of_samples.
 %   H: polynomial matrix corresponding to the convolution (=MA).
 %You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
 
 %length of the convolution (L):
-    L = MAparameters.L;
+    L = MA.L;
 
 %driving noise/source (e) and subspace dimensions (de):
     [e,de] = sample_subspaces(data_type,num_of_comps,num_of_samples+L);
 
 %convolution (H):
     Dx = ceil((undercompleteness+1) * De); %undercomplete problem (Dx > De)
-    H = MA_polynomial(Dx,De,MAparameters);
+    H = MA_polynomial(Dx,De,MA);
     
 %observation (x):
     E_temp = concatenation(e,L+1);

code/IPA/demos/demo_ARX_IPA.m

         u_size = 0.2; %size of the control; box constraint, i.e., |u_i| <= u_size
         %ISA:
             unknown_dimensions = 0;%0: '{d_m}_{m=1}^M: known'; 1: 'M is known'
-            ICA_method = 'fastICA'; %see 'estimate_ICA.m'
-            %ISA cost (during the clustering of the ICA elements):
-                cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-                cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
-                opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
-                %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
-        %ARX:                
-            ARXmethod_parameters.method = 'NIW'; %ARX identification method; see 'estimate_ARX_IPA.m'
-            ARXmethod_parameters.Lx = Ls; %AR order
-            ARXmethod_parameters.Lu = Lu; %control ('X') order
+            %ICA solver (see 'estimate_ICA.m'):
+                ICA.opt_type = 'fastICA';
+            %ISA solver (clustering of the ICA elements):
+                ISA.cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
+                ISA.cost_name = 'Renyi_kNN_k'; %example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+                ISA.opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
+                %Many combinations are allowed for ISA.cost_type, ISA.cost_name and ISA.opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
+        %ARX solver:                
+            ARX.method = 'NIW'; %ARX identification method; see 'estimate_ARX_IPA.m'
+            ARX.Lx = Ls; %AR order
+            ARX.Lu = Lu; %control ('X') order
             
 %data generation (e,de,Fs,Bs,A,Fx,Bx,Ae,du):
     [e,de,Fs,Bs,A,Fx,Bx,Ae,du] = generate_ARX_IPA_parameters(data_type,num_of_comps,num_of_samples,Ls,F_lambda,Lu,du);
 
 %estimation (e_hat,de_hat,Fs_hat,Bs_hat,W_hat,Fx_hat,Bx_hat,s_hat): 
-    [e_hat,de_hat,Fs_hat,Bs_hat,W_hat,Fx_hat,Bx_hat,s_hat] = estimate_ARX_IPA(Fx,Bx,Ae,sum(de),du,u_size,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,ARXmethod_parameters); %note: Fx,Bx, and Ae are used for observation generation only.
+    [e_hat,de_hat,Fs_hat,Bs_hat,W_hat,Fx_hat,Bx_hat,s_hat] = estimate_ARX_IPA(Fx,Bx,Ae,sum(de),du,u_size,ICA,ISA,unknown_dimensions,de,ARX); %note: Fx,Bx, and Ae are used for observation generation only.
 
 %result:
     %Fx,Fx_hat, %ideally: these two matrices are equal

code/IPA/demos/demo_AR_IPA.m

             F_lambda = 0.7; %AR stability parameter, 0<F_lambda<1, see 'generate_AR_polynomial.m'
     %estimation:
         %ARfit:
-            ARmethod_parameters.L = L;%AR order; can be vector, too; in that case the 'best' AR order is chosen according to SBC, see 'estimate_AR.m'.
-            ARmethod_parameters.method = 'stepwiseLS';%AR estimation method, see 'estimate_AR.m'
+            AR.L = L;%AR order; can be vector, too; in that case the 'best' AR order is chosen according to SBC, see 'estimate_AR.m'.
+            AR.method = 'stepwiseLS';%AR estimation method, see 'estimate_AR.m'
         %ISA:
             unknown_dimensions = 0;%0: '{d_m}_{m=1}^M: known'; 1: 'M is known'
-            ICA_method = 'fastICA'; %see 'estimate_ICA.m'
-            %ISA cost (during the clustering of the ICA elements):
-                cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-                cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
-                opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
-                %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
+            %ICA solver (see 'estimate_ICA.m'):            
+                ICA.opt_type = 'fastICA';
+            %ISA solver (clustering of the ICA elements):
+                ISA.cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
+                ISA.cost_name = 'Renyi_kNN_k'; %example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+                ISA.opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
+                %Many combinations are allowed for ISA.cost_type, ISA.cost_name and ISA.opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
         
 %data generation (x,A,s,e,F,de):
     [x,A,s,e,Fs,de,Fx] = generate_AR_IPA(data_type,num_of_comps,num_of_samples,L,F_lambda);
     
 %estimation (s_hat,e_hat,W_hat,de_hat,Fx_hat,Fs_hat):
-    [s_hat,e_hat,W_hat,de_hat,Fx_hat,Fs_hat] = estimate_AR_IPA(x,ARmethod_parameters,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de);
+    [s_hat,e_hat,W_hat,de_hat,Fx_hat,Fs_hat] = estimate_AR_IPA(x,AR,ICA,ISA,unknown_dimensions,de);
     
 %result:
     %global matrix(G):

code/IPA/demos/demo_ISA.m

         num_of_samples = 2*1000;%number of samples
     %estimation:    
         unknown_dimensions = 0;%0: '{d_m}_{m=1}^M: known'; 1: 'M is known'
-        ICA_method = 'fastICA'; %see 'estimate_ICA.m'
-        %ISA cost (during the clustering of the ICA elements):
-            cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-            cost_name = 'Renyi_kNN_1tok'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
-            opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
-            %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
+        %ICA solver (see 'estimate_ICA.m'):
+            %I:
+                ICA.opt_type = 'fastICA'; 
+            %II:
+                %ICA.opt_type = 'EASI';
+        %ISA solver (clustering of the ICA elements):
+            ISA.cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
+            ISA.cost_name = 'Renyi_kNN_1tok'; %example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+            ISA.opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
+            %Many combinations are allowed for ISA.cost_type, ISA.cost_name and ISA.opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'.
         
 %data generation (x,A,e,de,num_of_comps):
     [x,A,e,de,num_of_comps] = generate_ISA(data_type,num_of_comps,num_of_samples);
     
 %estimation (s_hat,W_hat,de_hat): 
-    [e_hat,W_hat,de_hat] = estimate_ISA(x,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,size(x,1));
+    [e_hat,W_hat,de_hat] = estimate_ISA(x,ICA,ISA,unknown_dimensions,de,size(x,1));
   
 %result:
     %global matrix(G):

code/IPA/demos/demo_MA_IPA_LPA.m

         data_type = 'Aw';%see 'sample_subspaces.m'
         num_of_comps = 3;%number of components/subspaces in sampling
         num_of_samples = 2*1000;%number of samples
-        MAparameters.type = 'stable'; %type of the convolution, see 'MA_polynomial.m'.
-        MAparameters.L = 1;%length of the convolution; H_0,...,H_{L}: L+1 H_j matrices
-        MAparameters.lambda = 0.4;%stability parameter of the convolution, |MAparameters.lambda|<1, MAparameters.lambda->0, the 'more stable' the convolution is.
+        MA.type = 'stable'; %type of the convolution, see 'MA_polynomial.m'.
+        MA.L = 1;%length of the convolution; H_0,...,H_{L}: L+1 H_j matrices
+        MA.lambda = 0.4;%stability parameter of the convolution, |MA.lambda|<1; the closer MA.lambda is to 0, the 'more stable' the convolution is.
     %ARfit:
-         ARmethod_parameters.method = 'stepwiseLS';%AR estimation method, see 'estimate_AR.m'
+         AR.method = 'stepwiseLS';%AR estimation method, see 'estimate_AR.m'
     %ISA:    
         unknown_dimensions = 0;%0: '{d_m}_{m=1}^M: known'; 1: 'M is known'
-        ICA_method = 'fastICA'; %see 'estimate_ICA.m'
-        %ISA cost (during the clustering of the ICA elements):
-            cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-            cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
-            opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
-            %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
+        %ICA solver (see 'estimate_ICA.m'):
+            ICA.opt_type = 'fastICA';
+        %ISA solver (clustering of the ICA elements):
+            ISA.cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
+            ISA.cost_name = 'Renyi_kNN_k'; %example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+            ISA.opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
+            %Many combinations are allowed for ISA.cost_type, ISA.cost_name and ISA.opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
     
 %data generation (x,H,e,de,num_of_comps):
     undercompleteness = 0;%complete system
-    [x,H,e,de,num_of_comps] = generate_MA_IPA(data_type,num_of_comps,num_of_samples,undercompleteness,MAparameters);
+    [x,H,e,de,num_of_comps] = generate_MA_IPA(data_type,num_of_comps,num_of_samples,undercompleteness,MA);
     
 %estimation via the LPA method (e_hat,de_hat):
-    [e_hat,de_hat] = estimate_MA_IPA_LPA(x,ARmethod_parameters,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de);
+    [e_hat,de_hat] = estimate_MA_IPA_LPA(x,AR,ICA,ISA,unknown_dimensions,de);
     
 %result:
     %global matrix(G):

code/IPA/demos/demo_PNL_ISA.m

         num_of_samples = 5*1000;%number of samples
     %ISA:    
         unknown_dimensions = 0;%0: '{d_m}_{m=1}^M: known'; 1: 'M is known'
-        ICA_method = 'fastICA'; %see 'estimate_ICA.m'
-        %ISA cost (during the clustering of the ICA elements):
-            cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-            cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
-            opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
-            %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
-    %gaussianization:
-        gaussianizationmethod_parameters.method = 'rank';%For possibilites, see 'estimate_gaussianization.m'.
-        gaussianizationmethod_parameters.c = 0.97; %parameter of the 'rank' method
+        %ICA solver (see 'estimate_ICA.m'):
+            ICA.opt_type = 'fastICA';
+        %ISA solver (clustering of the ICA elements):
+            ISA.cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
+            ISA.cost_name = 'Renyi_kNN_k'; %example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+            ISA.opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
+            %Many combinations are allowed for ISA.cost_type, ISA.cost_name and ISA.opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
+    %gaussianization technique:
+        gaussianization.method = 'rank';%For possibilities, see 'estimate_gaussianization.m'.
+        gaussianization.c = 0.97; %parameter of the 'rank' method
         
 %data generation (x_LIN,A,e,de,num_of_comps):
     [x_LIN,A,e,de,num_of_comps] = generate_ISA(data_type,num_of_comps,num_of_samples);
         h = plot_subspaces(x_PNL,data_type,'post nonlinear mixture (x=g(Ae))');
         
 %estimation (W_hat,e_hat,de_hat):    
-    [W_hat,e_hat,de_hat] = estimate_PNL_ISA(x_PNL,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,gaussianizationmethod_parameters);
+    [W_hat,e_hat,de_hat] = estimate_PNL_ISA(x_PNL,ICA,ISA,unknown_dimensions,de,gaussianization);
 
 %result:
     %global matrix(G):

code/IPA/demos/demo_complex_ISA.m

         num_of_samples = 5*1000;%number of samples
     %(real) ISA:    
         unknown_dimensions = 0;%0: '{d_m}_{m=1}^M: known'; 1: 'M is known'
-        ICA_method = 'fastICA'; %see 'estimate_ICA.m'
-        %ISA cost (during the clustering of the ICA elements):
-            cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-            cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
-            opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
-            %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
+        %ICA solver (see 'estimate_ICA.m'):
+            ICA.opt_type = 'fastICA';
+        %ISA solver (clustering of the ICA elements):
+            ISA.cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
+            ISA.cost_name = 'Renyi_kNN_k'; %example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+            ISA.opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
+            %Many combinations are allowed for ISA.cost_type, ISA.cost_name and ISA.opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
         
 %data generation (x,A,e,de,num_of_comps):
     [x,A,e,de,num_of_comps] = generate_complex_ISA(data_type,num_of_comps,num_of_samples);
     
 %estimation (e_real_hat,W_real_hat,de_real_hat,e_hat):
-    [e_real_hat,W_real_hat,de_real_hat,e_hat] = estimate_complex_ISA(x,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,size(x,1));
+    [e_real_hat,W_real_hat,de_real_hat,e_hat] = estimate_complex_ISA(x,ICA,ISA,unknown_dimensions,de,size(x,1));
 
 %result:
     %global matrix(G):

code/IPA/demos/demo_complex_ISA_C.m

     %dataset:
         data_type = 'multi4-geom';%see 'sample_subspaces.m'
         num_of_comps = 2;%number of components/subspaces in sampling
-        num_of_samples = 5*1000;%number of samples
+        num_of_samples = 10*1000;%number of samples
     %estimation:    
         unknown_dimensions = 0;%0: '{d_m}_{m=1}^M: known'; 1: 'M is known'
-        ICA_method = 'fastICA'; %see 'estimate_complex_ICA.m'
-        %ISA cost (during the clustering of the ICA elements):
-            cost_type = 'sum-I'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-            cost_name = 'complex'; %example: cost_type = 'sumH', cost_name = 'complex' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated via the 'complex' method.
-            opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
-            %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
+        %ICA solver (see 'estimate_complex_ICA.m'):
+            ICA.opt_type = 'EASI'; 
+        %ISA solver (clustering of the ICA elements):
+            ISA.cost_type = 'sum-I'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
+            ISA.cost_name = 'complex'; %example: ISA.cost_type = 'sumH', ISA.cost_name = 'complex' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated via the 'complex' method.
+            ISA.opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
+            %Many combinations are allowed for ISA.cost_type, ISA.cost_name and ISA.opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
         
 %data generation (x,A,e,de,num_of_comps):
     [x,A,e,de,num_of_comps] = generate_complex_ISA(data_type,num_of_comps,num_of_samples);
     
  %estimation (e_hat,W_hat,de_hat):
-    [e_hat,W_hat,de_hat] = estimate_complex_ISA_C(x,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de);
+    [e_hat,W_hat,de_hat] = estimate_complex_ISA_C(x,ICA,ISA,unknown_dimensions,de);
 
 %result:
     %global matrix(G):

code/IPA/demos/demo_fAR_IPA.m

         L = 1; %fAR order (tested intensively for L=1 only: 'recursive_Nadaraya_Watson_estimator.m') 
     %ISA:    
         unknown_dimensions = 0;%0: '{d_m}_{m=1}^M: known'; 1: 'M is known'
-        ICA_method = 'fastICA'; %see 'estimate_ICA.m'
-        %ISA cost (during the clustering of the ICA elements):
-            cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-            cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
-            opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
-            %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
+        %ICA solver (see 'estimate_ICA.m'):
+            ICA.opt_type = 'fastICA';
+        %ISA solver (clustering of the ICA elements):
+            ISA.cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
+            ISA.cost_name = 'Renyi_kNN_k'; %example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+            ISA.opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
+            %Many combinations are allowed for ISA.cost_type, ISA.cost_name and ISA.opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
     %fAR:
-        fARmethod_parameters.beta_normalized = 1/8; %/in (0,1)
-        fARmethod_parameters.method = 'recursiveNW';%fAR estimation method, see 'estimate_fAR.m'        
+        fAR.beta_normalized = 1/8; %\in (0,1)
+        fAR.method = 'recursiveNW'; %fAR estimation method, see 'estimate_fAR.m'
+        fAR.L = L; %fAR order
         
 %data generation (x,A,s,ds,num_of_comps):
     [x,A,s,e,de,num_of_comps] = generate_fAR_IPA(data_type,num_of_comps,num_of_samples,L);
     
 %estimation (e_hat,W_hat,de_hat,s_hat): 
-    [e_hat,W_hat,de_hat,s_hat] = estimate_fAR_IPA(x,L,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,fARmethod_parameters);
+    [e_hat,W_hat,de_hat,s_hat] = estimate_fAR_IPA(x,ICA,ISA,unknown_dimensions,de,fAR);
 
 %result:
     %global matrix(G):

code/IPA/demos/demo_mAR_IPA.m

             F_lambda = 0.7; %AR stability parameter, 0<F_lambda<1, see 'generate_AR_polynomial.m'
         p = 0.05; %p=P(x(i,t) is not observable)
     %estimation:
-        %ARfit:
-            ARmethod_parameters.L = L;%AR order; can be vector, too; in that case the 'best' AR order is chosen according to SBC, see 'estimate_mAR.m'.
-            ARmethod_parameters.method = 'subspace';%mAR estimation method, see 'estimate_mAR.m'
+        %mARfit:
+            mAR.L = L;%mAR order; can be vector, too; in that case the 'best' AR order is chosen according to SBC, see 'estimate_mAR.m'.
+            mAR.method = 'subspace';%mAR estimation method, see 'estimate_mAR.m'
         %ISA:
             unknown_dimensions = 0;%0: '{d_m}_{m=1}^M: known'; 1: 'M is known'
-            ICA_method = 'fastICA'; %see 'estimate_ICA.m'
-            %ISA cost (during the clustering of the ICA elements):
-                cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-                cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
-                opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
-                %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
+            %ICA solver (see 'estimate_ICA.m'):
+                ICA.opt_type = 'fastICA';
+            %ISA solver (clustering of the ICA elements):
+                ISA.cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
+                ISA.cost_name = 'Renyi_kNN_k'; %example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+                ISA.opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
+                %Many combinations are allowed for ISA.cost_type, ISA.cost_name and ISA.opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
         
 %data generation (y,x,A,s,e,F,de):
     [y,x,A,s,e,Fs,de,Fx] = generate_mAR_IPA(data_type,num_of_comps,num_of_samples,L,F_lambda,p);
     
 %estimation (s_hat,e_hat,W_hat,de_hat,Fx_hat,Fs_hat):
-    [s_hat,e_hat,W_hat,de_hat,Fx_hat,Fs_hat] = estimate_mAR_IPA(y,ARmethod_parameters,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de);
+    [s_hat,e_hat,W_hat,de_hat,Fx_hat,Fs_hat] = estimate_mAR_IPA(y,mAR,ICA,ISA,unknown_dimensions,de);
     
 %result:
     %global matrix(G):

code/IPA/demos/demo_uMA_IPA_LPA.m

         num_of_comps = 3;%number of components/subspaces in sampling
         num_of_samples = 5*1000;%number of samples
         undercompleteness = 1; %dim(x) = ceil((undercompleteness+1) x dim(s))
-        MAparameters.type = 'randn'; %type of the convolution, see 'MA_polynomial.m'.        
-        MAparameters.L = 1;%length of the convolution; H_0,...,H_{L}: L+1 H_j matrices
+        MA.type = 'randn'; %type of the convolution, see 'MA_polynomial.m'.        
+        MA.L = 1;%length of the convolution; H_0,...,H_{L}: L+1 H_j matrices
     %ISA:    
         unknown_dimensions = 0;%0: '{d_m}_{m=1}^M: known'; 1: 'M is known'
-        ICA_method = 'fastICA'; %see 'estimate_ICA.m'
-        %ISA cost (during the clustering of the ICA elements):
-            cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-            cost_name = 'Renyi_kNN_k'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
-            opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
-            %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
+        %ICA solver (see 'estimate_ICA.m'):
+            ICA.opt_type = 'fastICA';
+        %ISA solver (clustering of the ICA elements):
+            ISA.cost_type = 'sumH'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
+            ISA.cost_name = 'Renyi_kNN_k'; %example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+            ISA.opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
+            %Many combinations are allowed for ISA.cost_type, ISA.cost_name and ISA.opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
      %ARfit:
-        ARmethod_parameters.L = [1:2*(MAparameters.L+1)];%AR order; %maximal order of the ARfit is 2*(L+1), this choice was robust for undercompleteness=1
-        ARmethod_parameters.method = 'stepwiseLS';%AR estimation method, see 'estimate_AR.m'
+        AR.L = [1:2*(MA.L+1)];%AR order; maximal order of the ARfit is 2*(L+1), this choice was robust for undercompleteness=1
+        AR.method = 'stepwiseLS';%AR estimation method, see 'estimate_AR.m'
     
 %data generation (x,H,e,de,num_of_comps):
-    [x,H,e,de,num_of_comps] = generate_MA_IPA(data_type,num_of_comps,num_of_samples,undercompleteness,MAparameters);
+    [x,H,e,de,num_of_comps] = generate_MA_IPA(data_type,num_of_comps,num_of_samples,undercompleteness,MA);
     
 %estimation via the LPA method (e_hat,W_hat,de_hat):
     De = sum(de); %dimension of the source(e)
-    [e_hat,W_hat,de_hat] = estimate_uMA_IPA_LPA(x,ARmethod_parameters,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,De);
+    [e_hat,W_hat,de_hat] = estimate_uMA_IPA_LPA(x,AR,ICA,ISA,unknown_dimensions,de,De);
     
 %result:
     %global matrix(G):

code/IPA/demos/demo_uMA_IPA_TCC.m

         num_of_comps = 3;%number of components/subspaces in sampling
         num_of_samples = 5*1000;%number of samples
         undercompleteness = 1; %dim(x) = ceil((undercompleteness+1) x dim(s))
-        MAparameters.type = 'randn'; %MA_type: type of the convolution, see 'MA_polynomial.m'.
-        MAparameters.L = 1;%length of the convolution; H_0,...,H_{L}: L+1 H_j matrices        
+        MA.type = 'randn'; %type of the convolution, see 'MA_polynomial.m'.
+        MA.L = 1;%length of the convolution; H_0,...,H_{L}: L+1 H_j matrices        
     %ISA:    
         unknown_dimensions = 0;%0: '{d_m}_{m=1}^M: known'; 1: 'M is known'
-        ICA_method = 'fastICA'; %see 'estimate_ICA.m'
-        %ISA cost (during the clustering of the ICA elements):
-            cost_type = 'Ipairwise1d'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
-            cost_name = 'GV'; %example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
-            opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
-            %A wide variety of combinations are allowed for cost_type, cost_name and opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
+        %ICA solver (see 'estimate_ICA.m'):
+            ICA.opt_type = 'fastICA';
+        %ISA solver (clustering of the ICA elements):
+            ISA.cost_type = 'Ipairwise1d'; %'I','sumH', 'sum-I','Irecursive', 'Ipairwise', 'Ipairwise1d'
+            ISA.cost_name = 'GV'; %example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are Renyi entropies estimated via kNN methods ('Renyi_kNN_1tok').
+            ISA.opt_type = 'greedy';%optimization type: 'greedy', 'CE', 'exhaustive', 'NCut', 'SP1', 'SP2', 'SP3'
+            %Many combinations are allowed for ISA.cost_type, ISA.cost_name and ISA.opt_type, see 'clustering_UD0.m', 'clustering_UD1.m'
     
 %data generation (x,H,e,de,num_of_comps):
-    [x,H,e,de,num_of_comps] = generate_MA_IPA(data_type,num_of_comps,num_of_samples,undercompleteness,MAparameters);
+    [x,H,e,de,num_of_comps] = generate_MA_IPA(data_type,num_of_comps,num_of_samples,undercompleteness,MA);
     
 %estimation via the TCC method (e_hat,W_hat,de_hat,L2):
-    [e_hat,W_hat,de_hat,L2] = estimate_uMA_IPA_TCC(x,MAparameters.L,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de);
+    [e_hat,W_hat,de_hat,L2] = estimate_uMA_IPA_TCC(x,MA.L,ICA,ISA,unknown_dimensions,de);
 
 %result:
     %mixing matrix and dimensions (de) associated with the TCC-based method:
         A = mixing_matrix_TCC(H,de,L2);%A=A(H,...)
-        de = kron(de,ones(MAparameters.L+L2,1));%the subspaces are recovered L+L2-times, in the ideal case
+        de = kron(de,ones(MA.L+L2,1));%the subspaces are recovered L+L2-times, in the ideal case
     %global matrix(G):
         G = W_hat * A;
         hinton_diagram(G,'global matrix (G=WA)');%ideally: block-scaling matrix
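The demo checks separation by inspecting a Hinton diagram of the global matrix G = W_hat*A, which should ideally be a block-scaling matrix. The same check can be done programmatically; below is a minimal Python sketch (illustrative only, not part of the Matlab toolbox; for simplicity it ignores possible permutations of the blocks):

```python
import numpy as np

def is_block_scaling(G, de, tol=1e-10):
    """Check whether G is (numerically) zero outside the diagonal blocks
    whose sizes are given by de; block permutations are NOT handled here."""
    mask = np.zeros_like(G, dtype=bool)
    offs = np.cumsum([0] + list(de))
    for a, b in zip(offs[:-1], offs[1:]):
        mask[a:b, a:b] = True          # entries inside a diagonal block
    return bool(np.all(np.abs(G[~mask]) < tol))

# a perfect separation yields a block-diagonal global matrix:
G = np.block([[np.random.randn(2, 2), np.zeros((2, 1))],
              [np.zeros((1, 2)),      np.random.randn(1, 1)]])
print(is_block_scaling(G, [2, 1]))   # True
```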

code/IPA/demos/estimate_AR.m

-function [x_innovation_hat,Fx_hat,SBCs] = estimate_AR(x,ARmethod_parameters)
+function [x_innovation_hat,Fx_hat,SBCs] = estimate_AR(x,AR)
 %AR estimation on observation x.
 %
 %INPUT:
 %   x: x(:,t) is the observation at time t.
-%   ARmethod_parameters.L: AR order; can be vector, too; in that case the 'best' AR order is chosen according to SBC, see 'estimate_AR.m'.
-%   ARmethod_parameters.method: AR estimation method. Possibilities: 'NIW' (see 'estimate_AR_NIW.m'), 'subspace', 'subspace-LL', 'LL' (see 'estimate_AR_E4.m')  and 'stepwiseLS' (see 'arfit.m').  
+%   AR: AR estimator,
+%      AR.L: AR order; can also be a vector; in that case the 'best' AR order is chosen according to SBC, see 'estimate_AR.m'.
+%      AR.method: AR estimation method. Possibilities: 'NIW' (see 'estimate_AR_NIW.m'), 'subspace', 'subspace-LL', 'LL' (see 'estimate_AR_E4.m') and 'stepwiseLS' (see 'arfit.m').
 %NOTE: the 'stepwiseLS' method corresponds to the stepwise least squares estimator of the ARfit Matlab package ("http://www.gps.caltech.edu/~tapio/arfit/"). The ARfit
 %   toolbox has NOT been included in the ITE package because its license is NOT compatible with 'GPLv3 or later'. However, we recommend downloading and 
 %   installing it, and using it as the DEFAULT AR estimator; it is quite efficient, and ITE is ready to call it.
 %OUTPUT:
 %   x_innovation_hat: estimated innovation of x, at time t it is x_innovation_hat(:,t).
 %   Fx_hat: estimated dynamics of x.
-%   SBCs: computed SBC values if ARmethod_parameters.L is vector, else SBCs = [].
+%   SBCs: computed SBC values if AR.L is a vector, else SBCs = [].
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %
 %You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
 
 %initialization:
-    method = ARmethod_parameters.method;
-    Ls = ARmethod_parameters.L;
+    method = AR.method;
+    Ls = AR.L;
     [D,num_of_samples] = size(x);
     disp(strcat(['ARfit: started (method: ',method,').']));
     switch method
         otherwise
             error('AR identification method=?');
     end
-disp('ARfit: ready.');    
+disp('ARfit: ready.');    
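When AR.L is a vector, 'estimate_AR.m' selects the 'best' order by SBC (Schwarz Bayesian Criterion). The idea can be sketched with a simplified scalar analogue in Python; the least-squares fit and the criterion constant below are illustrative assumptions, not ITE code:

```python
import numpy as np

def ar_ls(x, L):
    """Least-squares AR(L) fit for a scalar series x; returns coefficients and residuals."""
    T = len(x)
    X = np.column_stack([x[L - k - 1:T - k - 1] for k in range(L)])  # lagged regressors
    y = x[L:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a, y - X @ a

def select_order_sbc(x, Ls):
    """Pick the AR order minimizing a Schwarz-type criterion (scalar SBC analogue)."""
    T = len(x)
    sbcs = []
    for L in Ls:
        _, r = ar_ls(x, L)
        sbcs.append(np.log(np.mean(r ** 2)) + L * np.log(T) / T)  # fit + complexity penalty
    return Ls[int(np.argmin(sbcs))], sbcs

rng = np.random.default_rng(0)
x = np.zeros(2000)
for t in range(1, 2000):                 # simulate an AR(1) process with coefficient 0.8
    x[t] = 0.8 * x[t - 1] + rng.standard_normal()
L_best, _ = select_order_sbc(x, [1, 2, 3, 4])
print(L_best)
```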

code/IPA/demos/estimate_ARX.m

-function [x_innov_hat,Fx_hat,Bx_hat,obs] = estimate_ARX(Fx,Bx,Ae,Dx,Du,u_size,ARXmethod_parameters)
+function [x_innov_hat,Fx_hat,Bx_hat,obs] = estimate_ARX(Fx,Bx,Ae,Dx,Du,u_size,ARX)
 %ARX estimation on observation. The observation is generated online.
 %
 %INPUT:
 %   Dx: dimension of the observation.
 %   Du: dimension of the control.
 %   u_size: size of the control; box constraint, i.e., |u_i| <= u_size
-%   ARXmethod_parameters:
-%       ARXmethod_parameters.method: ARX identification method. Possibilities: 'NIW' (online Bayesian technique with normal-inverted Wishart prior).
-%       ARXmethod_parameters.Lx: AR order.
-%       ARXmethod_parameters.Lu: control ('X') order.
+%   ARX: ARX estimator,
+%       ARX.method: ARX identification method. Possibilities: 'NIW' (online Bayesian technique with normal-inverted Wishart prior).
+%       ARX.Lx: AR order.
+%       ARX.Lu: control ('X') order.
 %OUTPUT:
 %   x_innov_hat: estimated innovation process of x.
 %   Fx_hat: estimated matrix describing the AR evolution of observation x.
 %
 %You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
 
-%ARXmethod_parameters -> method, Lx, Lu:
-    method = ARXmethod_parameters.method;
-    Lx = ARXmethod_parameters.Lx;
-    Lu = ARXmethod_parameters.Lu;
+%ARX -> method, Lx, Lu:
+    method = ARX.method;
+    Lx = ARX.Lx;
+    Lu = ARX.Lu;
     
     disp(strcat(['ARX identification: started. Method: ', method,'.']));
     switch method

code/IPA/demos/estimate_ARX_IPA.m

-function [e_hat,de_hat,Fs_hat,Bs_hat,W_hat,Fx_hat,Bx_hat,s_hat] = estimate_ARX_IPA(Fx,Bx,Ae,Dx,Du,u_size,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,ARXmethod_parameters)
+function [e_hat,de_hat,Fs_hat,Bs_hat,W_hat,Fx_hat,Bx_hat,s_hat] = estimate_ARX_IPA(Fx,Bx,Ae,Dx,Du,u_size,ICA,ISA,unknown_dimensions,de,ARX)
 %Estimates the ARX-IPA model. Method: ARX identification + ISA on the estimated innovation.
 %
 %INPUT:
 %   Dx: dimension of the observation.
 %   Du: dimension of the control.
 %   u_size: size of the control; box constraint, i.e., |u_i| <= u_size
-%   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization  type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   ICA: solver for independent component analysis, see 'estimate_ICA.m'.
+%   ISA: solver for independent subspace analysis (=clustering of the ICA elements). ISA.cost_type, ISA.cost_name, ISA.opt_type: cost type, cost name, optimization type. Example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok', ISA.opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means 'the subspace dimensions are known'; '1' means 'the number of the subspaces are known' (but the individual dimensions are unknown).
 %   de: 
 %       1)in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %       2)in case of 'unknown_dimensions = 1': the length of 'de' must be equal to the number of subspaces, but the coordinates of the vector can be arbitrary.
-%   ARXmethod_parameters: parameters of the ARX identification method, see 'estimate_ARX.m'.
+%   ARX: ARX estimator, see 'estimate_ARX.m'.
 %OUTPUT:
 %   e_hat: e_hat(:,t) is the estimated driving noise at time t.
 %   de_hat: in case of known subspace dimensions ('unknown_dimensions = 0') de_hat = de; else it contains the estimated subspace dimensions; ordered increasingly.
 %You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
 
 %ARX-identification = estimate dynamics (Fx_hat, Bx_hat) and innovation (x_innov_hat):
-    [x_innov_hat,Fx_hat,Bx_hat,observation] = estimate_ARX(Fx,Bx,Ae,Dx,Du,u_size,ARXmethod_parameters);%the observation is generated online
+    [x_innov_hat,Fx_hat,Bx_hat,observation] = estimate_ARX(Fx,Bx,Ae,Dx,Du,u_size,ARX);%the observation is generated online
         
 %ISA on the estimated innovation:
-    [e_hat,W_hat,de_hat] = estimate_ISA(x_innov_hat,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,size(x_innov_hat,1));    
+    [e_hat,W_hat,de_hat] = estimate_ISA(x_innov_hat,ICA,ISA,unknown_dimensions,de,size(x_innov_hat,1));    
     
 %basis transformation (x->s): Fs_hat, Bs_hat, s_hat:
     %Fs_hat,Bs_hat:

code/IPA/demos/estimate_AR_IPA.m

-function [s_hat,e_hat,W_hat,de_hat,Fx_hat,Fs_hat] = estimate_AR_IPA(x,ARmethod_parameters,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de)
+function [s_hat,e_hat,W_hat,de_hat,Fx_hat,Fs_hat] = estimate_AR_IPA(x,AR,ICA,ISA,unknown_dimensions,de)
 %Estimates the AR-IPA model. Method: AR identification + ISA on the estimated innovation.
 %
 %INPUT:
 %   x: x(:,t) is the observation at time t.
-%   ARmethod_parameters:
-%      ARmethod_parameters.L: AR order; can be vector, too; in that case the 'best' AR order is chosen according to SBC, see 'estimate_AR.m'.
-%      ARmethod_parameters.method: AR estimation method. Possibilities: 'NIW', 'subspace', 'subspace-LL', 'LL'.
-%   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization  type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   AR: AR estimator, see 'estimate_AR.m'.
+%   ICA: solver for independent component analysis, see 'estimate_ICA.m'.
+%   ISA: solver for independent subspace analysis (=clustering of the ICA elements). ISA.cost_type, ISA.cost_name, ISA.opt_type: cost type, cost name, optimization type. Example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok', ISA.opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means 'the subspace dimensions are known'; '1' means 'the number of the subspaces are known' (but the individual dimensions are unknown).
 %   de: 
 %       1)in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
 
 %AR fit to x:
-   [x_innovation_hat,Fx_hat,SBCs] = estimate_AR(x,ARmethod_parameters);
+   [x_innovation_hat,Fx_hat,SBCs] = estimate_AR(x,AR);
    
 %ISA on the estimated innovation of x:
-   [e_hat,W_hat,de_hat] = estimate_ISA(x_innovation_hat,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,size(x_innovation_hat,1));
+   [e_hat,W_hat,de_hat] = estimate_ISA(x_innovation_hat,ICA,ISA,unknown_dimensions,de,size(x_innovation_hat,1));
    
 %estimated source(s_hat) and its dynamics:
    s_hat = W_hat * x;

code/IPA/demos/estimate_ICA.m

-function [e_hat,W_hat]  = estimate_ICA(x,ICA_method,dim_reduction)
+function [e_hat,W_hat]  = estimate_ICA(x,ICA,dim_reduction)
 %Performs ICA on signal x.
 %
 %INPUT:
 %   x: x(:,t) is the t^th sample.
-%   ICA_method: the method to perform ICA with. ('fastICA')
+%   ICA: the method to perform independent component analysis. Possibilities: 
+%      1)ICA.opt_type = 'fastICA'.
+%      2)ICA.opt_type = 'EASI'.
 %   dim_reduction: <=dim(x)[= size(x,1)]; in case of '<', dimension reduction is also carried out.
 %OUTPUT:
 %   e_hat: estimated ICA elements; e_hat(:,t) is the t^th sample.
 
 disp('ICA estimation: started.');
 
-switch ICA_method
+switch ICA.opt_type
     case 'fastICA'
         %whitening:
             %1)If (i) e is white and (ii) A is orthogonal, then this step can be discarded.
-            %2)Alternative: perform whitening in ICA directly.
+            %2)Alternative: perform whitening in ICA directly. It is often
+            %advantageous to carry out whitening separately, since it roughly halves the number
+            %of parameters to be optimized. [A dxd demixing matrix 'W' has d^2 parameters; a dxd orthogonal matrix is described by only d(d-1)/2 parameters.]
             [x,W_whiten] = whiten(x,dim_reduction);
         [e_hat,A_ICA,W_ICA] = fastica(x, 'whiteSig',x,'whiteMat',eye(dim_reduction),'dewhiteMat',eye(dim_reduction), ...
                                      'approach', 'symm', 'g', 'tanh','displayMode', 'off','verbose','off',  ...
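The whitening step discussed in the comments above (centering, then mapping the data to identity covariance) can be sketched in a few lines. An illustrative symmetric, eigendecomposition-based Python version follows; ITE's 'whiten.m' may differ in its details:

```python
import numpy as np

def whiten(x):
    """Center x (d x T) and map it to identity sample covariance."""
    x = x - x.mean(axis=1, keepdims=True)
    C = x @ x.T / x.shape[1]                        # sample covariance
    lam, E = np.linalg.eigh(C)                      # eigendecomposition of C
    W = E @ np.diag(1.0 / np.sqrt(lam)) @ E.T       # symmetric whitening matrix
    return W @ x, W

rng = np.random.default_rng(0)
x = np.linalg.cholesky([[4.0, 1.0], [1.0, 2.0]]) @ rng.standard_normal((2, 5000))
z, W = whiten(x)
print(np.round(z @ z.T / z.shape[1], 2))            # ~ identity matrix
```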

code/IPA/demos/estimate_ISA.m

-function [e_hat,W_hat,de_hat] = estimate_ISA(x,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,dim_reduction)
+function [e_hat,W_hat,de_hat] = estimate_ISA(x,ICA,ISA,unknown_dimensions,de,dim_reduction)
 %Estimates the ISA model. Method: ICA + clustering of the ICA elements (=ISA separation theorem).
 %
 %INPUT:
 %   x: x(:,t) is the observation at time t.
-%   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization  type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   ICA: solver for independent component analysis, see 'estimate_ICA.m'.
+%   ISA: solver for independent subspace analysis (=clustering of the ICA elements). ISA.cost_type, ISA.cost_name, ISA.opt_type: cost type, cost name, optimization type. Example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok', ISA.opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means 'the subspace dimensions are known'; '1' means 'the number of the subspaces are known' (but the individual dimensions are unknown).
 %   de: 
 %       1)in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 disp('ISA estimation: started.');
 
 %ICA step (e_ICA,W_ICA):
-    [e_ICA,W_ICA] = estimate_ICA(x,ICA_method,dim_reduction);
+    [e_ICA,W_ICA] = estimate_ICA(x,ICA,dim_reduction);
     
 %clustering of the ICA elements (=permutation search; perm_ICA):
     switch unknown_dimensions
         case 0
-            perm_ICA = clustering_UD0(e_ICA,de,opt_type,cost_type,cost_name);%'UD0': unknown_dimensions=0
+            perm_ICA = clustering_UD0(e_ICA,de,ISA.opt_type,ISA.cost_type,ISA.cost_name);%'UD0': unknown_dimensions=0
             de_hat = de;%it is given
         case 1
-            if strcmp(cost_type,'Ipairwise1d')
+            if strcmp(ISA.cost_type,'Ipairwise1d')
                 num_of_comps = length(de);
-                [perm_ICA,de_hat] = clustering_UD1(e_ICA,num_of_comps,opt_type,cost_name);%'UD1': unknown_dimensions=1                    
+                [perm_ICA,de_hat] = clustering_UD1(e_ICA,num_of_comps,ISA.opt_type,ISA.cost_name);%'UD1': unknown_dimensions=1                    
                 [de_hat,per] = sort_subspaces_dimensions(de_hat);%sort the subspaces in increasing order with respect to their dimensions.
                 perm_ICA = perm_ICA(per);%apply permutation 'perm_ICA' and then permutation 'per'
             else
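The ISA separation theorem reduces ISA to ICA followed by clustering of the 1-d ICA components; with a pairwise cost such as 'Ipairwise1d', the search operates on a matrix of pairwise dependencies. A toy Python sketch of a greedy grouping with known subspace dimensions follows (the dependence matrix S and the tie-breaking are illustrative; this is not ITE's 'clustering_UD0.m'):

```python
import numpy as np

def greedy_cluster(S, de):
    """Group indices into clusters of the given sizes by greedily adding, to each
    cluster, the unused index most dependent (largest summed S) on its members.
    Clusters are built in size-descending order."""
    unused = list(range(S.shape[0]))
    clusters = []
    for d in sorted(de, reverse=True):
        # seed with the strongest remaining pair (or a single index if d == 1):
        i, j = max(((a, b) for a in unused for b in unused if a < b),
                   key=lambda p: S[p]) if d > 1 else (unused[0], None)
        c = [i] if j is None else [i, j]
        for k in c:
            unused.remove(k)
        while len(c) < d:
            k = max(unused, key=lambda m: S[m, c].sum())
            c.append(k)
            unused.remove(k)
        clusters.append(sorted(c))
    return clusters

# toy pairwise dependence matrix with two true groups {0,1,2} and {3,4}:
S = np.zeros((5, 5))
for a, b in [(0, 1), (0, 2), (1, 2), (3, 4)]:
    S[a, b] = S[b, a] = 1.0
print(greedy_cluster(S, [3, 2]))   # [[0, 1, 2], [3, 4]]
```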

code/IPA/demos/estimate_MA_IPA_LPA.m

-function [e_hat,de_hat] = estimate_MA_IPA_LPA(x,ARmethod_parameters,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de)
+function [e_hat,de_hat] = estimate_MA_IPA_LPA(x,AR,ICA,ISA,unknown_dimensions,de)
 %Estimates the complete MA-IPA model. Method: LPA (Linear Predictive Approximation, =AR fit) + ISA on the estimated innovation.
 %
 %INPUT:
 %   x: x(:,t) is the observation at time t.
-%   ARmethod_parameters:
-%      ARmethod_parameters.L: AR order; can be vector, too; in that case the 'best' AR order is chosen according to SBC, see 'estimate_AR.m'.
-%      ARmethod_parameters.method: AR estimation method. Possibilities: 'NIW', 'subspace', 'subspace-LL', 'LL'.
-%   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization  type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   AR: AR estimator, see 'estimate_AR.m'.
+%   ICA: solver for independent component analysis, see 'estimate_ICA.m'.
+%   ISA: solver for independent subspace analysis (=clustering of the ICA elements). ISA.cost_type, ISA.cost_name, ISA.opt_type: cost type, cost name, optimization type. Example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok', ISA.opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means 'the subspace dimensions are known'; '1' means 'the number of the subspaces are known' (but the individual dimensions are unknown).
 %   de: 
 %       1)in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %AR fit to x (x_innovation_hat):    
     num_of_samples = size(x,2);
     Lmax = max(1,floor(const*num_of_samples^(1/3-epsi)));%o(T^{1/3}) to guarantee consistency
-    ARmethod_parameters.L = [1:Lmax];
-    [x_innovation_hat,Fx_hat,SBCs] = estimate_AR(x,ARmethod_parameters);    
+    AR.L = [1:Lmax];
+    [x_innovation_hat,Fx_hat,SBCs] = estimate_AR(x,AR);    
     
 %ISA on the estimated innovation of x (e_hat,de_hat):
-   [e_hat,W_hat,de_hat] = estimate_ISA(x_innovation_hat,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,size(x_innovation_hat,1));
+   [e_hat,W_hat,de_hat] = estimate_ISA(x_innovation_hat,ICA,ISA,unknown_dimensions,de,size(x_innovation_hat,1));
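The cap Lmax = max(1, floor(const*T^(1/3-epsi))) above keeps the fitted AR order o(T^{1/3}), which the LPA consistency argument requires. A quick numerical check of how the cap grows with the sample size (the 'const' and 'epsi' defaults below are illustrative; the demo sets its own values elsewhere):

```python
from math import floor

def lmax(T, const=2.0, epsi=0.01):
    """AR order cap growing as o(T^(1/3)), mirroring the LPA demo's rule."""
    return max(1, floor(const * T ** (1 / 3 - epsi)))

print([lmax(T) for T in (100, 1000, 10000)])
```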
     

code/IPA/demos/estimate_PNL_ISA.m

-function [W_hat,e_hat,de_hat] = estimate_PNL_ISA(x,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,gaussianizationmethod_parameters)
+function [W_hat,e_hat,de_hat] = estimate_PNL_ISA(x,ICA,ISA,unknown_dimensions,de,gaussianization)
 %Estimates the PNL-ISA model. Method: gaussianization + ISA on the result of the gaussianization.
 %
 %INPUT:
 %   x: x(:,t) is the observation at time t.
-%   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization  type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   ICA: solver for independent component analysis, see 'estimate_ICA.m'.
+%   ISA: solver for independent subspace analysis (=clustering of the ICA elements). ISA.cost_type, ISA.cost_name, ISA.opt_type: cost type, cost name, optimization type. Example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok', ISA.opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means 'the subspace dimensions are known'; '1' means 'the number of the subspaces are known' (but the individual dimensions are unknown).
 %   de: 
 %       1)in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %       2)in case of 'unknown_dimensions = 1': the length of 'de' must be equal to the number of subspaces, but the coordinates of the vector can be arbitrary.
-%   gaussianizationmethod_parameters: gaussianization method and its parameters, see 'estimate_gaussianization.m'
+%   gaussianization: gaussianization method and its parameters, see 'estimate_gaussianization.m'
 %OUTPUT:
 %   e_hat: e_hat(:,t) is the estimated ISA source at time t.
 %   W_hat: estimated ISA demixing matrix.
 %You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
 
 %gaussianization:       
-    [x_LIN_hat] = estimate_gaussianization(x,gaussianizationmethod_parameters);
+    [x_LIN_hat] = estimate_gaussianization(x,gaussianization);
     
 %(linear) ISA:
-    [e_hat,W_hat,de_hat] = estimate_ISA(x_LIN_hat,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,size(x_LIN_hat,1));        
+    [e_hat,W_hat,de_hat] = estimate_ISA(x_LIN_hat,ICA,ISA,unknown_dimensions,de,size(x_LIN_hat,1));        

code/IPA/demos/estimate_complex_ICA.m

-function [e_hat,W_hat]  = estimate_complex_ICA(x,ICA_method)
+function [e_hat,W_hat]  = estimate_complex_ICA(x,ICA)
 %Performs complex ICA on signal x.
 %
 %INPUT:
 %   x: x(:,t) is the t^th sample.
-%   ICA_method: the method to perform ICA with.
+%   ICA: the method to perform independent component analysis. Possibilities: 
+%      1)ICA.opt_type = 'fastICA'.
+%      2)ICA.opt_type = 'EASI'.
 %OUTPUT:
 %   e_hat: estimated ICA elements; e_hat(:,t) is the t^th sample.
 %   W_hat: estimated ICA demixing matrix.
 
 disp('Complex ICA estimation: started.');
 
-switch ICA_method
+switch ICA.opt_type
     case 'fastICA'
             [x,W_whiten] = complex_whiten(x);
             defl = 1;%1:deflation, 0:no deflation.

code/IPA/demos/estimate_complex_ISA.m

-function [e_real_hat,W_real_hat,de_real_hat,e_hat] = estimate_complex_ISA(x,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,dim_reduction)
+function [e_real_hat,W_real_hat,de_real_hat,e_hat] = estimate_complex_ISA(x,ICA,ISA,unknown_dimensions,de,dim_reduction)
 %Estimates the complex ISA model. Method: complex -> real transformation + real ISA.
 %
 %INPUT:
 %   x: x(:,t) is the observation at time t.
-%   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization  type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   ICA: solver for independent component analysis, see 'estimate_ICA.m'.
+%   ISA: solver for independent subspace analysis (=clustering of the ICA elements). ISA.cost_type, ISA.cost_name, ISA.opt_type: cost type, cost name, optimization type. Example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok', ISA.opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means 'the subspace dimensions are known'; '1' means 'the number of the subspaces are known' (but the individual dimensions are unknown).
 %   de: 
 %       1)in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
     dim_reduction_real = 2 * dim_reduction;
     
 %ISA on the transformed data:    
-    [e_real_hat,W_real_hat,de_real_hat] = estimate_ISA(x_real,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de_real,dim_reduction_real);
+    [e_real_hat,W_real_hat,de_real_hat] = estimate_ISA(x_real,ICA,ISA,unknown_dimensions,de_real,dim_reduction_real);
     
 %real->complex transformation:    
     e_hat = R2C_vector(e_real_hat);

code/IPA/demos/estimate_complex_ISA_C.m

-function [e_hat,W_hat,de_hat] = estimate_complex_ISA_C(x,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de)
+function [e_hat,W_hat,de_hat] = estimate_complex_ISA_C(x,ICA,ISA,unknown_dimensions,de)
 %Estimates the complex ISA model. Method: complex ICA + clustering of the complex ICA elements (=complex ISA separation theorem).
 %
 %INPUT:
 %   x: x(:,t) is the observation at time t.
-%   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization  type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   ICA: solver for independent component analysis, see 'estimate_complex_ICA.m'.
+%   ISA: solver for independent subspace analysis (=clustering of the ICA elements). ISA.cost_type, ISA.cost_name, ISA.opt_type: cost type, cost name, optimization type. Example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok', ISA.opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means 'the subspace dimensions are known'; '1' means 'the number of the subspaces are known' (but the individual dimensions are unknown).
 %   de: 
 %       1)in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 disp('Complex ISA estimation: started.');
 
 %complex ICA step (e_ICA,W_ICA):
-    [e_ICA,W_ICA] = estimate_complex_ICA(x,ICA_method);
+    [e_ICA,W_ICA] = estimate_complex_ICA(x,ICA);
 
 %clustering of the ICA elements (=permutation search; perm_ICA):
     switch unknown_dimensions
         case 0
-            perm_ICA = clustering_UD0(e_ICA,de,opt_type,cost_type,cost_name);%'UD0': unknown_dimensions=0
+            perm_ICA = clustering_UD0(e_ICA,de,ISA.opt_type,ISA.cost_type,ISA.cost_name);%'UD0': unknown_dimensions=0
             de_hat = de;%it is given
         case 1
-            if strcmp(cost_type,'Ipairwise1d')
-                [perm_ICA,de_hat] = clustering_UD1(e_ICA,num_of_comps,opt_type,cost_name);%'UD1': unknown_dimensions=1                    
+            if strcmp(ISA.cost_type,'Ipairwise1d')
+                [perm_ICA,de_hat] = clustering_UD1(e_ICA,num_of_comps,ISA.opt_type,ISA.cost_name);%'UD1': unknown_dimensions=1                    
                 [de_hat,per] = sort_subspaces_dimensions(de_hat);%sort the subspaces in increasing order with respect to their dimensions.
                 perm_ICA = perm_ICA(per);%apply permutation 'perm_ICA' and then permutation 'per'
             else

code/IPA/demos/estimate_fAR.m

-function [E] = estimate_fAR(x,L,fARmethod_parameters)
-%Estimates fAR model of order L. Method: estimate the "u_t = [x_{t-1},...,x_{t-L}] -> v_t = x_t" mapping nonparametrically.
+function [E] = estimate_fAR(x,fAR)
+%Estimates fAR model of order fAR.L (=:L). Method: estimate the "u_t = [x_{t-1},...,x_{t-L}] -> v_t = x_t" mapping nonparametrically.
 %
 %INPUT:
 %   x: x(:,t) is the t^th observation from the fAR model.
-%   L: fAR order.
-%   fARmethod_parameters: parameters of the fAR estimator, see 'recursive_Nadaraya_Watson_estimator.m'.
+%   fAR: fAR estimator,
+%      fAR.method: method used for fAR identification. Possibilities: 'recursiveNW'.
+%      fAR.L: fAR order, see also 'recursive_Nadaraya_Watson_estimator.m'.
 %OUTPUT:
 %   E: estimated innovation, E(:,t) at time t.
 %
 %
 %You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
 
-switch fARmethod_parameters.method
+%initialization:
+    L = fAR.L;
+
+switch fAR.method
     case 'recursiveNW'
         U = concatenation(x(:,1:end-1),L);
         V = x(:,L+1:end);
-        E = V - recursive_Nadaraya_Watson_estimator(U,V,fARmethod_parameters);
+        E = V - recursive_Nadaraya_Watson_estimator(U,V,fAR);
     otherwise
         error('fAR fit method=?');
 end
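The 'recursiveNW' option estimates the u_t -> v_t regression nonparametrically. A plain (non-recursive) Nadaraya-Watson sketch in Python with a Gaussian kernel follows; the bandwidth h is a hypothetical choice, and ITE's recursive estimator differs in its update scheme:

```python
import numpy as np

def nadaraya_watson(U, V, Uq, h=0.5):
    """Kernel-regression estimate of E[v | u] at query points Uq.
    U: (d, T) inputs, V: (p, T) outputs, Uq: (d, Q) queries."""
    d2 = ((U[:, None, :] - Uq[:, :, None]) ** 2).sum(axis=0)  # (Q, T) squared distances
    K = np.exp(-d2 / (2 * h ** 2))                            # Gaussian kernel weights
    return V @ K.T / K.sum(axis=1)                            # (p, Q) weighted means

rng = np.random.default_rng(0)
U = rng.uniform(-2, 2, size=(1, 400))
V = np.sin(U) + 0.05 * rng.standard_normal((1, 400))
v0 = nadaraya_watson(U, V, np.array([[0.0]])).item()
print(v0)                                                     # close to sin(0) = 0
```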

code/IPA/demos/estimate_fAR_IPA.m

-function [e_hat,W_hat,de_hat,s_hat] = estimate_fAR_IPA(x,L,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,fARmethod_parameters)
+function [e_hat,W_hat,de_hat,s_hat] = estimate_fAR_IPA(x,ICA,ISA,unknown_dimensions,de,fAR)
 %Estimates the fAR-IPA model. Method: fAR identification + ISA on the estimated innovation.
 %
 %INPUT:
 %   x: x(:,t) is the t^th observation from the fAR model.
-%   L: fAR order.
-%   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization  type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   ICA: solver for independent component analysis, see 'estimate_ICA.m'.
+%   ISA: solver for independent subspace analysis (=clustering of the ICA elements). ISA.cost_type, ISA.cost_name, ISA.opt_type: cost type, cost name, optimization type. Example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok', ISA.opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
 %   unknown_dimensions: '0' means 'the subspace dimensions are known'; '1' means 'the number of the subspaces are known' (but the individual dimensions are unknown).
 %   de: 
 %       1)in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %       2)in case of 'unknown_dimensions = 1': the length of 'de' must be equal to the number of subspaces, but the coordinates of the vector can be arbitrary.
-%   fARmethod_parameters: parameters of the fAR estimator, see 'estimate_fAR.m'.
+%   fAR: fAR estimator, see 'estimate_fAR.m'.
 %OUTPUT:
 %   e_hat: e_hat(:,t) is the estimated driving noise at time t.
 %   W_hat: estimated demixing matrix.
 %You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
 
 %fAR identification, estimated innovation (x_innovation_hat):
-    x_innovation_hat = estimate_fAR(x,L,fARmethod_parameters);
+    x_innovation_hat = estimate_fAR(x,fAR);
     
 %ISA on the estimated innovation:    
-    [e_hat,W_hat,de_hat] = estimate_ISA(x_innovation_hat,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,size(x_innovation_hat,1));
+    [e_hat,W_hat,de_hat] = estimate_ISA(x_innovation_hat,ICA,ISA,unknown_dimensions,de,size(x_innovation_hat,1));
     
 %estimated source (s_hat):    
     s_hat = W_hat * x;    

code/IPA/demos/estimate_gaussianization.m

-function [s_transformed] = estimate_gaussianization(s,gaussianizationmethod_parameters)
+function [s_transformed] = estimate_gaussianization(s,gaussianization)
 %Gaussianization of the variable s using the rank technique.
 %
 %INPUT:
 %   s(:,t): variable at time t.
-%   method: gaussianization method. Possibilities: 'rank'.
+%   gaussianization.method: gaussianization method. Possibilities: 'rank'.
 %OUTPUT:
 %   s_transformed(:,t): variable at time t, transformed variant of s.
 %
 %
 %You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
 
-method = gaussianizationmethod_parameters.method;
+method = gaussianization.method;
 
 %initialize method-specific parameters:
     switch method
         case 'rank'
-            c = gaussianizationmethod_parameters.c;
+            c = gaussianization.c;
         otherwise
             error('Gaussianization method=?');
     end
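The 'rank' gaussianization maps each coordinate through its empirical ranks and then the inverse normal CDF. An illustrative Python sketch (the rank shift `c` mirrors the 'gaussianization.c' parameter, but its exact role here is an assumption, not ITE's implementation):

```python
import numpy as np
from statistics import NormalDist

def rank_gaussianize(s, c=0.5):
    """Rank-based gaussianization of each row of s (d x T):
    ranks -> values in (0, 1) -> inverse normal CDF."""
    T = s.shape[1]
    out = np.empty_like(s, dtype=float)
    for i in range(s.shape[0]):
        ranks = s[i].argsort().argsort() + 1   # ranks 1..T
        u = (ranks - c) / T                    # shift keeps values strictly inside (0, 1)
        out[i] = [NormalDist().inv_cdf(v) for v in u]
    return out

rng = np.random.default_rng(0)
s = rng.exponential(size=(1, 1000))            # heavily skewed input
g = rank_gaussianize(s)
print(round(float(np.mean(g)), 2), round(float(np.std(g)), 2))
```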

code/IPA/demos/estimate_mAR.m

-function [x_innovation_hat,Fx_hat,SBCs] = estimate_mAR(x,ARmethod_parameters)
+function [x_innovation_hat,Fx_hat,SBCs] = estimate_mAR(x,mAR)
 %mAR estimation on observation x.
 %
 %INPUT:
 %   x: x(:,t) is the observation at time t. x(i,t)=NaN means 'not available' observation.
-%   ARmethod_parameters.method: 
-%      ARmethod_parameters.L: AR order; can be vector, too; in that case the 'best' AR order is chosen according to SBC, see 'estimate_mAR.m'.
-%      AR estimation method. Possibilities: 'NIW' (see 'estimate_mAR_NIW.m'), 'subspace', 'subspace-LL', 'LL' (see 'estimate_mAR_E4.m') 
+%   mAR: 
+%      mAR.L: AR order; can be vector, too; in that case the 'best' AR order is chosen according to SBC, see 'estimate_mAR.m'.
+%      mAR.method: AR estimation method. Possibilities: 'NIW' (see 'estimate_mAR_NIW.m'), 'subspace', 'subspace-LL', 'LL' (see 'estimate_mAR_E4.m').
 %OUTPUT:
 %   x_innovation_hat: estimated innovation of x, at time t it is x_innovation_hat(:,t).
 %   Fx_hat: estimated dynamics of x.
-%   SBCs: computed SBC values if ARmethod_parameters.L is vector, else SBCs = [].
+%   SBCs: computed SBC values if mAR.L is a vector, else SBCs = [].
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %
 %You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
 
 %initialization:
-    method = ARmethod_parameters.method;
-    Ls = ARmethod_parameters.L;
+    method = mAR.method;
+    Ls = mAR.L;
     [D,num_of_samples] = size(x);
     disp(strcat(['mARfit: started (method: ',method,').']));
     switch method
                 x_innovation_hat = x_innovation_hat_opt;
                 Fx_hat = Fx_hat_opt;
             end
-        case 'stepwiseLS'%ARfit Matlab package
-            %Fx_hat:
-                Ls,
-                [w_temp, Fx_hat, C, SBCs] = arfit(x.', min(Ls), max(Ls));
-                %[w_temp, Fx_hat, C, SBCs] = arfit(x.', min(Ls), max(Ls),'sbc','zero');
-            %x_innovation_hat:
-                [siglev,x_innovation_hat] = arres(w_temp, Fx_hat, x.');
-                x_innovation_hat = x_innovation_hat.';
         otherwise
             error('mAR identification method=?');
     end
-disp('mARfit: ready.');    
+disp('mARfit: ready.');    

code/IPA/demos/estimate_mAR_IPA.m

-function [s_hat,e_hat,W_hat,de_hat,Fx_hat,Fs_hat] = estimate_mAR_IPA(x,ARmethod_parameters,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de)
+function [s_hat,e_hat,W_hat,de_hat,Fx_hat,Fs_hat] = estimate_mAR_IPA(x,mAR,ICA,ISA,unknown_dimensions,de)
 %Estimates the mAR-IPA model. Method: mAR identification + ISA on the estimated innovation.
 %
 %INPUT:
 %   x: x(:,t) is the observation at time t.
-%   ARmethod_parameters:
-%      ARmethod_parameters.method: AR estimation method. Possibilities: 'NIW', 'subspace', 'subspace-LL', 'LL'.
-%   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization  type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   mAR: mAR estimator, see 'estimate_mAR.m'.
+%   ICA: solver for independent component analysis, see 'estimate_ICA.m'.
+%   ISA: solver for independent subspace analysis (= clustering of the ICA elements). ISA.cost_type, ISA.cost_name, ISA.opt_type: cost type, cost name, optimization type. Example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok', ISA.opt_type = 'greedy' means an entropy-sum ISA formulation ('sumH') where the entropies are Rényi entropies estimated via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'.
 %   unknown_dimensions: '0' means 'the subspace dimensions are known'; '1' means 'the number of the subspaces are known' (but the individual dimensions are unknown).
 %   de: 
 %       1)in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
 
 %mAR fit to x:
-   [x_innovation_hat,Fx_hat,SBCs] = estimate_mAR(x,ARmethod_parameters);
+   [x_innovation_hat,Fx_hat,SBCs] = estimate_mAR(x,mAR);
    
 %ISA on the estimated innovation of x:
-   [e_hat,W_hat,de_hat] = estimate_ISA(x_innovation_hat,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,size(x_innovation_hat,1));
+   [e_hat,W_hat,de_hat] = estimate_ISA(x_innovation_hat,ICA,ISA,unknown_dimensions,de,size(x_innovation_hat,1));
    
 %estimated source(s_hat) and its dynamics:
    s_hat = W_hat * x;

code/IPA/demos/estimate_uMA_IPA_LPA.m

-function [e_hat,W_hat,de_hat] = estimate_uMA_IPA_LPA(x,ARmethod_parameters,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,De)
+function [e_hat,W_hat,de_hat] = estimate_uMA_IPA_LPA(x,AR,ICA,ISA,unknown_dimensions,de,De)
 %Estimates the uMA-IPA model. Method: LPA (Linear Predictive Approximation, =AR fit) + ISA on the estimated innovation.
 %
 %INPUT:
 %   x: x(:,t) is the observation at time t.
-%   ARmethod_parameters:
-%      ARmethod_parameters.L: AR order; can be vector, too; in that case the 'best' AR order is chosen according to SBC, see 'estimate_AR.m'.
-%      ARmethod_parameters.method: AR estimation method. Possibilities: 'NIW', 'subspace', 'subspace-LL', 'LL'.
-%   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization  type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   AR: AR estimator, see 'estimate_AR.m'.
+%   ICA: solver for independent component analysis, see 'estimate_ICA.m'.
+%   ISA: solver for independent subspace analysis (= clustering of the ICA elements). ISA.cost_type, ISA.cost_name, ISA.opt_type: cost type, cost name, optimization type. Example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok', ISA.opt_type = 'greedy' means an entropy-sum ISA formulation ('sumH') where the entropies are Rényi entropies estimated via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'.
 %   unknown_dimensions: '0' means 'the subspace dimensions are known'; '1' means 'the number of the subspaces are known' (but the individual dimensions are unknown).
 %   de: 
 %       1)in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
 %You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
    
 %AR fit to x:
-   [x_innovation_hat,Fx_hat,SBCs] = estimate_AR(x,ARmethod_parameters);
+   [x_innovation_hat,Fx_hat,SBCs] = estimate_AR(x,AR);
    
 %ISA on the estimated innovation of x:
-   [e_hat,W_hat,de_hat] = estimate_ISA(x_innovation_hat,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,De);
+   [e_hat,W_hat,de_hat] = estimate_ISA(x_innovation_hat,ICA,ISA,unknown_dimensions,de,De);
 

code/IPA/demos/estimate_uMA_IPA_TCC.m

-function [e_hat,W_hat,de_hat,L2] = estimate_uMA_IPA_TCC(x,L,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de)
+function [e_hat,W_hat,de_hat,L2] = estimate_uMA_IPA_TCC(x,L,ICA,ISA,unknown_dimensions,de)
 %Estimates the uMA-IPA model. Method: temporal concatenation of the observations + ISA.
 %
 %INPUT:
 %   x: x(:,t) is the observation at time t.
 %   L: length of the convolution; H_0,...,H_{L}: L+1 H_j matrices.
-%   ICA_method: the name of the ICA method applied, see 'estimate_ICA.m'.
-%   cost_type, cost_name, opt_type: cost type, cost name, optimization  type. Example: cost_type = 'sumH', cost_name = 'Renyi_kNN_1tok', opt_type = 'greedy' means that we use an entropy sum ISA formulation ('sumH'), where the entropies are estimated Renyi entropies via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'
+%   ICA: solver for independent component analysis, see 'estimate_ICA.m'.
+%   ISA: solver for independent subspace analysis (= clustering of the ICA elements). ISA.cost_type, ISA.cost_name, ISA.opt_type: cost type, cost name, optimization type. Example: ISA.cost_type = 'sumH', ISA.cost_name = 'Renyi_kNN_1tok', ISA.opt_type = 'greedy' means an entropy-sum ISA formulation ('sumH') where the entropies are Rényi entropies estimated via kNN methods ('Renyi_kNN_1tok') and the optimization is greedy; see also 'demo_ISA.m'.
 %   unknown_dimensions: '0' means 'the subspace dimensions are known'; '1' means 'the number of the subspaces are known' (but the individual dimensions are unknown).
 %   de: 
 %       1)in case of 'unknown_dimensions = 0': 'de' contains the subspace dimensions.
         end
      %de:
         de = kron(de,ones(L+L2,1));%de is a column vector
-    [e_hat,W_hat,de_hat] = estimate_ISA(X,ICA_method,opt_type,cost_type,cost_name,unknown_dimensions,de,dim_reduction);
+    [e_hat,W_hat,de_hat] = estimate_ISA(X,ICA,ISA,unknown_dimensions,de,dim_reduction);
    

code/shared/MA_polynomial.m

-function [H] = MA_polynomial(Dx,De,MAparameters)
+function [H] = MA_polynomial(Dx,De,MA)
 %Generates an MA polynomial H[z] of dimension Dx x De (H[z]\in R[z]^{Dx x De}) and length L, for an uMA-IPA/complete MA-IPA problem.
 %
 %INPUT: 
-%   MAparameters.type: 
+%   MA.type: 
 %      1)'randn' ('rand'): the coordinates of H are i.i.d. standard normal (uniform = U[0,1]) variables.
 %      2)'stable': H corresponds to the polynomial matrix H[z] = [\prod_{k=1}^L(I-lambda O_k z^{-1})]H_0: O_k,H_0: random orthogonal, |MA.lambda|<1.
-%   MAparameters.L: length of the convolution; H_0,...,H_{L}: L+1 H_j matrices.
+%   MA.L: length of the convolution; H_0,...,H_{L}: L+1 H_j matrices.
 %OUTPUT:
 %   H = [H_0,...,H_L], where H[z] = \sum_{l=0}^L H_l z^{-l}.
 %EXAMPLE:
 %   %uMA-IPA:
-%      MAparameters.type = 'randn';
-%      H = MA_polynomial(4,2,3,MAparameters);
+%      MA.type = 'randn';
+%      MA.L = 3;
+%      H = MA_polynomial(4,2,MA);
 %   %MA-IPA:
-%       MAparameters.type = 'stable';
-%       MAparameters.lambda = 0.4;
-%       H = MA_polynomial(4,2,3,MAparameters);
+%       MA.type = 'stable';
+%       MA.L = 3;
+%       MA.lambda = 0.4;
+%       H = MA_polynomial(4,4,MA);
 %
 %Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
 %
 %
 %You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
 
-switch MAparameters.type
+switch MA.type
     case 'randn'%uMA-IPA
-        H = randn(Dx,De*(MAparameters.L+1));
+        H = randn(Dx,De*(MA.L+1));
     case 'rand'%uMA-IPA
-        H = rand(Dx,De*(MAparameters.L+1));  
+        H = rand(Dx,De*(MA.L+1));  
     case 'stable'%MA-IPA (Dx=De)
         if Dx==De
-            MA_L = MAparameters.L;
-            MA_lambda = MAparameters.lambda;
+            MA_L = MA.L;
+            MA_lambda = MA.lambda;
             %Hz = [\prod_{k=1}^L(I-lambda O_i z^{-1})]: O_i:random orthogonal.
                 Hz = {eye(De)};            
                 for k = 1 : MA_L

code/shared/embedded/KDP/src/kdpee.c

 //#include <stdbool.h>
 typedef int bool;
 
-//natural logarithm of 2, used in Log2.
-const double ln_2 =  0.693147181;
-
-//Calculates log2 of a number. You need this function in case of using Microsoft (Microsoft does not provide log2...), otherwise you can use the available log2 function.
-double Log2( double n )  
-{  
-    return log(n) / ln_2; // log(n)/log(2) is log2(n)
+#ifdef _MSC_VER
+const double loge_2 = 0.6931471805599453; //natural logarithm of 2
+//Fallback log2 for MSVC, whose C runtime does not provide log2.
+double log2( double n )
+{
+    return log(n) / loge_2; //log(n)/log(2) = log2(n)
 }
+#endif
 
 #ifndef false
 	#define false 0
 
 floatval kdpee(const floatval **dimrefs, const int n, const int d, floatval *mins, 
 								floatval *maxs, const floatval zcut, int *keys){
-	//int minlev = ceil(0.5 * log2(n)); // Min partitioning level
-    int minlev = ceil(0.5 * Log2(n)); // Min partitioning level (Microsoft does not provide log2...)
+	int minlev = ceil(0.5 * log2(n)); // Min partitioning level
 	int i;
-    floatval result;
-    
+	floatval result;
+
 	for(i = 0; i < n; ++i){
 		keys[i] = i; // initialise keys
 	}
 	// As well as returning the median, this PARTITIONS the data (keys) in-place
 	floatval median = kdpee_hoareMedian(dimrefs[dimno], keys, minindex, maxindex);
 	floatval zscore;
-    
+
 	if(curlev == minlev){
 		mayTerminate = true; // We have passed the lower termination depth
 	}
-	
 
 	if(mayTerminate){
 		zscore = (sqrt(thesize) * (median+median-mins[dimno]-maxs[dimno]) 
 		// We need to partition and recurse
 		floatval oldextremum;
 		floatval left,right;
-        
+
 		int newmaxindex, newminindex;
 		if((thesize & 1) == 0){ // even # points
 			newmaxindex = minindex + thesize/2 - 1;
 		// Remember the outer extremum, replace with median, then recurse
 		oldextremum = maxs[dimno];
 		maxs[dimno] = median;
-		left = kdpee_recurse(dimrefs, n, d, mins, 
-								maxs, zcut, keys,
+		left = kdpee_recurse(dimrefs, n, d, mins, maxs, zcut, keys,
 								mayTerminate, curlev+1, n_rec,
 								minindex, newmaxindex, minlev
 								);
 		// Remember the outer extremum, replace with median, then recurse
 		oldextremum = mins[dimno];
 		mins[dimno] = median;
-		right = kdpee_recurse(dimrefs, n, d, mins, 
-								maxs, zcut, keys,
+		right = kdpee_recurse(dimrefs, n, d, mins, maxs, zcut, keys,
 								mayTerminate, curlev+1, n_rec,
 								newminindex, maxindex, minlev
 								);

code/shared/recursive_Nadaraya_Watson_estimator.m

-function [V_hat] = recursive_Nadaraya_Watson_estimator(U,V,fARmethod_parameters)
+function [V_hat] = recursive_Nadaraya_Watson_estimator(U,V,fAR)
 %Recursive Nadaraya-Watson nonparametric estimator.
 %
 %INPUT:
 %   U: U(:,t) is the t^th input sample.
 %   V: V(:,t) is the t^th output sample. The task is to learn the g: U(:,t) -> V(:,t) mapping.
-%   fARmethod_parameters: parameters of the recursive_Nadaraya_Watson_estimator:
-%       fARmethod_parameters.beta_normalized \in (0,1).
+%   fAR: parameters of the recursive Nadaraya-Watson estimator; fAR.beta_normalized \in (0,1).
 %OUTPUT:
 %   V_hat: \hat{g}(U)
 %
 
 %initialization:
     [D,T] = size(V);%T=size(U,2)
-    bet = fARmethod_parameters.beta_normalized / D; 
+    bet = fAR.beta_normalized / D; 
     V_hat = zeros(size(V));
     c1 = [1:T].^(bet*D); %row vector
     c2G =  [1:T].^(2*bet); %row vector; 'G'<->'Gaussian'