Files changed (31)

+4 −0 CHANGELOG.txt
+12 −3 README.md
+1 −1 code/H_I_D_C/base_estimators/CCE_kNN_k_estimation.m
+1 −1 code/H_I_D_C/base_estimators/DBhattacharyya_kNN_k_estimation.m
+37 −0 code/H_I_D_C/base_estimators/DCS_KDE_iChol_estimation.m
+30 −0 code/H_I_D_C/base_estimators/DCS_KDE_iChol_initialization.m
+37 −0 code/H_I_D_C/base_estimators/DED_KDE_iChol_estimation.m
+30 −0 code/H_I_D_C/base_estimators/DED_KDE_iChol_initialization.m
+1 −1 code/H_I_D_C/base_estimators/DHellinger_kNN_k_estimation.m
+1 −1 code/H_I_D_C/base_estimators/DKL_kNN_k_estimation.m
+1 −1 code/H_I_D_C/base_estimators/DKL_kNN_kiTi_estimation.m
+1 −1 code/H_I_D_C/base_estimators/DL2_kNN_k_estimation.m
+2 −2 code/H_I_D_C/base_estimators/DMMDonline_estimation.m
+1 −1 code/H_I_D_C/base_estimators/DRenyi_kNN_k_estimation.m
+1 −1 code/H_I_D_C/base_estimators/DTsallis_kNN_k_estimation.m
+32 −0 code/H_I_D_C/base_estimators/IQMI_CS_KDE_direct_estimation.m
+31 −0 code/H_I_D_C/base_estimators/IQMI_CS_KDE_direct_initialization.m
+31 −0 code/H_I_D_C/base_estimators/IQMI_CS_KDE_iChol_estimation.m
+31 −0 code/H_I_D_C/base_estimators/IQMI_CS_KDE_iChol_initialization.m
+31 −0 code/H_I_D_C/base_estimators/IQMI_ED_KDE_iChol_estimation.m
+31 −0 code/H_I_D_C/base_estimators/IQMI_ED_KDE_iChol_initialization.m
+27 −0 code/shared/embedded/ITL/d_cs.m
+27 −0 code/shared/embedded/ITL/d_ed.m
+78 −0 code/shared/embedded/ITL/incompleteCholesky.m
+83 −0 code/shared/embedded/ITL/incompleteCholeskyMulti.m
+25 −0 code/shared/embedded/ITL/incompleteCholeskySigma.m
+674 −0 code/shared/embedded/ITL/license.txt
+28 −0 code/shared/embedded/ITL/qmi_cs.m
+44 −0 code/shared/embedded/ITL/qmi_cs_2.m
+28 −0 code/shared/embedded/ITL/qmi_ed.m
+35 −0 code/shared/embedded/ITL/readme.txt
README.md
ITE can estimate Shannon, Rényi, Tsallis entropy; generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon, L2, Rényi, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa; complex variants of entropy and mutual information; L2, Rényi, Tsallis, Kullback-Leibler divergence; Hellinger, Bhattacharyya distance; maximum mean discrepancy, J-distance, and cross-entropy.
+ `mutual information`: generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon mutual information, L2 mutual information, Rényi mutual information, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa, complex mutual information, Cauchy-Schwartz quadratic mutual information, Euclidean distance based quadratic mutual information,

+ `divergence`: Kullback-Leibler divergence, L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy, J-distance, Cauchy-Schwartz divergence, Euclidean distance based divergence,
 its extensions to different linear, controlled, post nonlinear, complex valued, partially observed models, as well as to systems with nonparametric source dynamics.
- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.21_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.21_code.tar.bz2),
+ code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.22_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.22_code.tar.bz2),
code/H_I_D_C/base_estimators/DCS_KDE_iChol_estimation.m
+%Estimates the Cauchy-Schwartz divergence using Gaussian KDE (kernel density estimation) and incomplete Cholesky decomposition. The number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] must be equal; otherwise their minimum is taken. Cost parameters are provided in the cost object co.
+%We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
+%REFERENCE: Sohan Seth and Jose C. Principe. On speeding up computation in information theoretic learning. In International Joint Conference on Neural Networks (IJCNN), pages 2883-2887, 2009.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+ warning('There must be an equal number of samples in Y1 and Y2 for this estimator. The minimum of the sample numbers has been taken.');
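The closed-form plug-in formula behind this estimator can be sketched compactly. The snippet below is an illustrative Python version of the exact (non-iChol) computation of D_CS(p,q) = −log[(∫pq)² / (∫p² ∫q²)], using the Gaussian convolution identity ∫Gσ(x−a)Gσ(x−b)dx = G_{σ√2}(a−b); the function name and bandwidth handling are hypothetical, not ITE's MATLAB interface, and the kernel normalizers are omitted because they cancel in the log-ratio:

```python
import numpy as np

def _pairwise_sq_dists(a, b):
    # squared Euclidean distances between columns of a (d x m) and b (d x n)
    return ((a[:, :, None] - b[:, None, :]) ** 2).sum(axis=0)

def cs_divergence_kde(y1, y2, sigma=1.0):
    """Plug-in Cauchy-Schwartz divergence with Gaussian KDE.
    y1, y2: d x n sample matrices (samples in columns, as in ITE)."""
    s2 = 2.0 * sigma**2  # variance of the convolved kernel
    pp = np.exp(-_pairwise_sq_dists(y1, y1) / (2 * s2)).mean()  # ~ int p^2
    qq = np.exp(-_pairwise_sq_dists(y2, y2) / (2 * s2)).mean()  # ~ int q^2
    pq = np.exp(-_pairwise_sq_dists(y1, y2) / (2 * s2)).mean()  # ~ int p*q
    return np.log(pp) + np.log(qq) - 2 * np.log(pq)
```

By the Cauchy-Schwarz inequality the result is nonnegative and is zero exactly when the two sample sets coincide.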
code/H_I_D_C/base_estimators/DCS_KDE_iChol_initialization.m
+%Initialization of the Cauchy-Schwartz divergence estimator using Gaussian KDE (kernel density estimation) and incomplete Cholesky decomposition.
+% 2)We make use of the naming convention 'D<name>_initialization', to ease embedding new divergence estimation methods.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
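The incomplete Cholesky decomposition this initializer sets up (shipped under code/shared/embedded/ITL/incompleteCholesky.m) builds a low-rank factor G with K ≈ GGᵀ by greedily pivoting on the largest residual diagonal entry, evaluating only one column of the Gram matrix per step. A hedged Python sketch of that idea, with hypothetical names and a Gaussian kernel assumed; this is not ITE's implementation:

```python
import numpy as np

def incomplete_cholesky(x, sigma=1.0, tol=1e-6):
    """Pivoted incomplete Cholesky of the Gaussian Gram matrix
    K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)).
    x: d x n samples (columns). Returns G (n x r) with K ~= G @ G.T."""
    n = x.shape[1]
    diag = np.ones(n)              # residual diagonal; K_ii = 1 for this kernel
    G = np.zeros((n, 0))
    while diag.max() > tol:
        i = int(diag.argmax())     # pivot: largest residual
        d2 = ((x - x[:, [i]]) ** 2).sum(axis=0)
        k_i = np.exp(-d2 / (2 * sigma**2))       # i-th column of K, on demand
        g = (k_i - G @ G[i]) / np.sqrt(diag[i])  # new factor column
        G = np.hstack([G, g[:, None]])
        diag = np.maximum(diag - g**2, 0.0)      # update residual diagonal
    return G
```

For a positive semidefinite K the entrywise reconstruction error is bounded by the residual diagonal, so stopping at tolerance `tol` bounds the approximation error.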
code/H_I_D_C/base_estimators/DED_KDE_iChol_estimation.m
+%Estimates the Euclidean distance based divergence using Gaussian KDE (kernel density estimation) and incomplete Cholesky decomposition. The number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] must be equal; otherwise their minimum is taken. Cost parameters are provided in the cost object co.
+%We make use of the naming convention 'D<name>_estimation', to ease embedding new divergence estimation methods.
+%REFERENCE: Sohan Seth and Jose C. Principe. On speeding up computation in information theoretic learning. In International Joint Conference on Neural Networks (IJCNN), pages 2883-2887, 2009.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+ warning('There must be an equal number of samples in Y1 and Y2 for this estimator. The minimum of the sample numbers has been taken.');
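For the Euclidean distance based divergence D_ED(p,q) = ∫(p−q)², the same Gaussian convolution identity yields a closed-form plug-in estimate; here the kernel normalizer does not cancel, unlike in the Cauchy-Schwartz log-ratio. An illustrative Python sketch of the exact (non-iChol) formula, with hypothetical names rather than ITE's interface:

```python
import numpy as np

def ed_divergence_kde(y1, y2, sigma=1.0):
    """Plug-in estimate of D_ED(p, q) = int (p - q)^2 with Gaussian KDE.
    y1, y2: d x n sample matrices (samples in columns, as in ITE)."""
    d = y1.shape[0]
    s2 = 2.0 * sigma**2                       # variance of the convolved kernel
    c = (2.0 * np.pi * s2) ** (-d / 2.0)      # its normalizing constant
    def term(a, b):
        d2 = ((a[:, :, None] - b[:, None, :]) ** 2).sum(axis=0)
        return c * np.exp(-d2 / (2.0 * s2)).mean()
    # int p^2 + int q^2 - 2 int p*q, each integral in closed form
    return term(y1, y1) + term(y2, y2) - 2.0 * term(y1, y2)
```

Since the expression equals the squared L2 distance between the two kernel density estimates, it is nonnegative and vanishes when the sample sets coincide.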
code/H_I_D_C/base_estimators/DED_KDE_iChol_initialization.m
+%Initialization of the Euclidean distance based divergence estimator using Gaussian KDE (kernel density estimation) and incomplete Cholesky decomposition.
+% 2)We make use of the naming convention 'D<name>_initialization', to ease embedding new divergence estimation methods.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
code/H_I_D_C/base_estimators/DMMDonline_estimation.m
- warning('There must be an equal number of samples in Y1 and Y2. The minimum of the sample numbers has been taken.');
+ warning('There must be an equal number of samples in Y1 and Y2 for this estimator. The minimum of the sample numbers has been taken.');
code/H_I_D_C/base_estimators/IQMI_CS_KDE_direct_estimation.m
+%Estimates the Cauchy-Schwartz quadratic mutual information (I) directly based on Gaussian KDE (kernel density estimation).
+% Sohan Seth and Jose C. Principe. On speeding up computation in information theoretic learning. In International Joint Conference on Neural Networks (IJCNN), pages 2883-2887, 2009.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
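In information-potential terms (Seth and Principe), the direct QMI estimate reduces to three averages over two kernel Gram matrices: a joint, a marginal, and a cross potential. A hedged Python sketch of that plug-in formula (function name and bandwidth handling are illustrative, not ITE's interface):

```python
import numpy as np

def qmi_cs_kde(x, y, sigma=1.0):
    """Direct plug-in Cauchy-Schwartz quadratic mutual information.
    x, y: d x n paired sample matrices (samples in columns)."""
    def gram(a):
        # Gram matrix of the convolved Gaussian kernel (variance 2*sigma^2);
        # normalizers cancel in the log-ratio below, so they are omitted
        d2 = ((a[:, :, None] - a[:, None, :]) ** 2).sum(axis=0)
        return np.exp(-d2 / (4.0 * sigma**2))
    Kx, Ky = gram(x), gram(y)
    v_joint = (Kx * Ky).mean()                            # ~ int int p_xy^2
    v_marg = Kx.mean() * Ky.mean()                        # ~ int int (p_x p_y)^2
    v_cross = (Kx.mean(axis=1) * Ky.mean(axis=1)).mean()  # ~ int int p_xy p_x p_y
    return np.log(v_joint) + np.log(v_marg) - 2.0 * np.log(v_cross)
```

Because the three quantities are exact integrals of products of the KDEs, the Cauchy-Schwarz inequality makes the estimate nonnegative, and it grows with the dependence between x and y.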
code/H_I_D_C/base_estimators/IQMI_CS_KDE_direct_initialization.m
+%Initialization of the Cauchy-Schwartz quadratic mutual information (QMI_CS) direct estimator based on Gaussian KDE (kernel density estimation).
+% 2)We make use of the naming convention 'I<name>_initialization', to ease embedding new mutual information estimation methods.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
code/H_I_D_C/base_estimators/IQMI_CS_KDE_iChol_estimation.m
+%Estimates the Cauchy-Schwartz quadratic mutual information (I) approximately, applying Gaussian KDE (kernel density estimation) and incomplete Cholesky decomposition.
+% Sohan Seth and Jose C. Principe. On speeding up computation in information theoretic learning. In International Joint Conference on Neural Networks (IJCNN), pages 2883-2887, 2009.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
code/H_I_D_C/base_estimators/IQMI_CS_KDE_iChol_initialization.m
+%Initialization of the Cauchy-Schwartz quadratic mutual information (QMI_CS) approximate estimator, applying Gaussian KDE (kernel density estimation) and incomplete Cholesky decomposition.
+% 2)We make use of the naming convention 'I<name>_initialization', to ease embedding new mutual information estimation methods.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
code/H_I_D_C/base_estimators/IQMI_ED_KDE_iChol_estimation.m
+%Estimates the Euclidean distance based quadratic mutual information (I) approximately, applying Gaussian KDE (kernel density estimation) and incomplete Cholesky decomposition.
+% Sohan Seth and Jose C. Principe. On speeding up computation in information theoretic learning. In International Joint Conference on Neural Networks (IJCNN), pages 2883-2887, 2009.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
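The Euclidean distance based counterpart replaces the Cauchy-Schwartz log-ratio with a plain squared-L2 expansion, ∫∫(p_xy − p_x p_y)² = V_J + V_M − 2V_C, so the kernel normalizers no longer cancel. An illustrative Python sketch of the direct (non-iChol) formula, with hypothetical names:

```python
import numpy as np

def qmi_ed_kde(x, y, sigma=1.0):
    """Plug-in Euclidean distance based quadratic mutual information,
    int int (p_xy - p_x p_y)^2, via Gaussian KDE.
    x, y: d x n paired sample matrices (samples in columns)."""
    def gram(a):
        # convolved Gaussian kernel (variance 2*sigma^2), normalizer included
        d = a.shape[0]
        d2 = ((a[:, :, None] - a[:, None, :]) ** 2).sum(axis=0)
        return (4.0 * np.pi * sigma**2) ** (-d / 2.0) * np.exp(-d2 / (4.0 * sigma**2))
    Kx, Ky = gram(x), gram(y)
    v_joint = (Kx * Ky).mean()                            # V_J: int int p_xy^2
    v_marg = Kx.mean() * Ky.mean()                        # V_M: int int (p_x p_y)^2
    v_cross = (Kx.mean(axis=1) * Ky.mean(axis=1)).mean()  # V_C: int int p_xy p_x p_y
    return v_joint + v_marg - 2.0 * v_cross
```

The incomplete Cholesky variant above approximates the same three potentials from a low-rank factor of the Gram matrices instead of forming them in full.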
code/H_I_D_C/base_estimators/IQMI_ED_KDE_iChol_initialization.m
+%Initialization of the Euclidean distance based quadratic mutual information (QMI_ED) approximate estimator, applying Gaussian KDE (kernel density estimation) and incomplete Cholesky decomposition.
+% 2)We make use of the naming convention 'I<name>_initialization', to ease embedding new mutual information estimation methods.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
CHANGELOG.txt