Files changed (6)

+3 −0  CHANGELOG.txt
+2 −2  README.md
+53 −0 code/H_I_D_A_C/base_estimators/HShannon_MaxEnt1_estimation.m
+29 −0 code/H_I_D_A_C/base_estimators/HShannon_MaxEnt1_initialization.m
+53 −0 code/H_I_D_A_C/base_estimators/HShannon_MaxEnt2_estimation.m
+29 −0 code/H_I_D_A_C/base_estimators/HShannon_MaxEnt2_initialization.m
README.md
- code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.32_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.32_code.tar.bz2),
+ code: [zip](https://bitbucket.org/szzoli/ite/downloads/ITE0.33_code.zip), [tar.bz2](https://bitbucket.org/szzoli/ite/downloads/ITE0.33_code.tar.bz2),
code/H_I_D_A_C/base_estimators/HShannon_MaxEnt1_estimation.m
+%Estimates the Shannon entropy (H) of Y using the maximum entropy distribution method. The Gi functions used are G1(x) = x exp(-x^2/2) and G2(x) = abs(x).
+%We use the naming convention 'H<name>_estimation' to ease embedding new entropy estimation methods.
+% Aapo Hyvarinen. New approximations of differential entropy for independent component analysis and projection pursuit. In Advances in Neural Information Processing Systems (NIPS), pages 273-279, 1997. (entropy approximation based on the maximum entropy distribution)
+% Thomas M. Cover and Joy A. Thomas. Elements of Information Theory. John Wiley and Sons, New York, USA, 1991. (maximum entropy distribution)
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+ H_whiten = log(s);%we will take this scaling into account via the entropy transformation rule [ H(wz) = H(z)+log(w) ] at the end
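The excerpt above whitens the data, applies Hyvärinen's maximum-entropy approximation, and then compensates for the whitening scale via the entropy transformation rule H(wz) = H(z) + log(w). A minimal Python sketch of the same idea, assuming the constants k1 = 36/(8√3 − 9) and k2 = 1/(2 − 6/π) from Hyvärinen's 1997 paper for this G1/G2 pair (the function name is illustrative, not ITE's API):

```python
import math
import numpy as np

def shannon_entropy_maxent1(y):
    """Approximate the Shannon entropy of a standardized 1-D sample y
    using the maximum entropy distribution method with
    G1(x) = x*exp(-x^2/2) and G2(x) = |x| (Hyvarinen, 1997)."""
    k1 = 36.0 / (8.0 * math.sqrt(3.0) - 9.0)
    k2 = 1.0 / (2.0 - 6.0 / math.pi)
    h_gauss = 0.5 * math.log(2.0 * math.pi * math.e)        # entropy of N(0,1)
    t1 = np.mean(y * np.exp(-y ** 2 / 2.0))                 # E[G1(y)]; 0 for Gaussian y
    t2 = np.mean(np.abs(y)) - math.sqrt(2.0 / math.pi)      # E[G2(y)] - E[G2(nu)], nu ~ N(0,1)
    return h_gauss - (k1 * t1 ** 2 + k2 * t2 ** 2)

# For (near-)Gaussian data the correction terms vanish, so the estimate is
# close to 0.5*log(2*pi*e); scaling by w shifts the entropy by log(w),
# exactly as the H(wz) = H(z) + log(w) comment in the excerpt uses.
rng = np.random.default_rng(0)
z = rng.standard_normal(100_000)
z = (z - z.mean()) / z.std()          # standardize, as whitening would
h = shannon_entropy_maxent1(z)
w = 3.0
h_scaled = h + math.log(w)            # entropy of w*z via the transformation rule
```

The per-dimension estimate is exact for the maximum entropy distribution matching the G-function moments, and degrades gracefully for other densities; that robustness is why the same approximations appear as negentropy contrasts in FastICA.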
code/H_I_D_A_C/base_estimators/HShannon_MaxEnt1_initialization.m
+% 2)We use the naming convention 'H<name>_initialization' to ease embedding new entropy estimation methods.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
code/H_I_D_A_C/base_estimators/HShannon_MaxEnt2_estimation.m
+%Estimates the Shannon entropy (H) of Y using the maximum entropy distribution method. The Gi functions used are G1(x) = x exp(-x^2/2) and G2(x) = exp(-x^2/2).
+%We use the naming convention 'H<name>_estimation' to ease embedding new entropy estimation methods.
+% Aapo Hyvarinen. New approximations of differential entropy for independent component analysis and projection pursuit. In Advances in Neural Information Processing Systems (NIPS), pages 273-279, 1997. (entropy approximation based on the maximum entropy distribution)
+% Thomas M. Cover and Joy A. Thomas. Elements of Information Theory. John Wiley and Sons, New York, USA, 1991. (maximum entropy distribution)
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
+ H_whiten = log(s);%we will take this scaling into account via the entropy transformation rule [ H(wz) = H(z)+log(w) ] at the end
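The MaxEnt2 variant differs from MaxEnt1 only in its second contrast function, G2(x) = exp(-x^2/2), whose Gaussian expectation is E[G2(ν)] = 1/√2 and whose weighting constant in Hyvärinen's 1997 paper is 24/(16√3 − 27). A hedged Python sketch (again illustrative, not ITE's API):

```python
import math
import numpy as np

def shannon_entropy_maxent2(y):
    """Approximate the Shannon entropy of a standardized 1-D sample y using
    G1(x) = x*exp(-x^2/2) and G2(x) = exp(-x^2/2) (Hyvarinen, 1997)."""
    k1 = 36.0 / (8.0 * math.sqrt(3.0) - 9.0)
    k2b = 24.0 / (16.0 * math.sqrt(3.0) - 27.0)
    h_gauss = 0.5 * math.log(2.0 * math.pi * math.e)
    t1 = np.mean(y * np.exp(-y ** 2 / 2.0))                    # E[G1(y)]
    # E[exp(-nu^2/2)] = 1/sqrt(2) for nu ~ N(0,1)
    t2 = np.mean(np.exp(-y ** 2 / 2.0)) - 1.0 / math.sqrt(2.0)
    return h_gauss - (k1 * t1 ** 2 + k2b * t2 ** 2)

rng = np.random.default_rng(1)
z = rng.standard_normal(100_000)
z = (z - z.mean()) / z.std()
h2 = shannon_entropy_maxent2(z)   # near 0.5*log(2*pi*e) for Gaussian data
```

The even G2 here makes the estimator sensitive to sub/super-Gaussian tails rather than to asymmetry, which is the practical reason for offering both MaxEnt1 and MaxEnt2 as base estimators.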
code/H_I_D_A_C/base_estimators/HShannon_MaxEnt2_initialization.m
+% 2)We use the naming convention 'H<name>_initialization' to ease embedding new entropy estimation methods.
+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
CHANGELOG.txt