# Files changed (6)

# CHANGELOG.txt

+-Edgeworth expansion based Shannon entropy estimator: added; see 'HShannon_Edgeworth_initialization.m', 'HShannon_Edgeworth_estimation.m'.

+-Lookup table for the underlying H/I/D estimation formulas: added (see ITE_documentation.pdf: Section C).

-The Hellinger and Bhattacharyya distances are now available in ITE. They can be estimated via k-nearest neighbor methods; see 'DHellinger_kNN_k_initialization.m', 'DHellinger_kNN_k_estimation.m', 'DBhattacharyya_kNN_k_initialization.m', and 'DBhattacharyya_kNN_k_estimation.m'.
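Both distances are functions of the Bhattacharyya coefficient BC(p,q) = integral of sqrt(p*q): D_B = -ln BC, and (in one common convention) D_H = sqrt(1 - BC). A minimal discrete-distribution sketch in Python, for orientation only — the toolbox itself is MATLAB and estimates these quantities nonparametrically via k-nearest neighbors:

```python
import math

def bhattacharyya_coefficient(p, q):
    """BC = sum_i sqrt(p_i * q_i) for discrete distributions
    (the continuous case replaces the sum with an integral)."""
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

def bhattacharyya_distance(p, q):
    """D_B = -ln BC(p, q)."""
    return -math.log(bhattacharyya_coefficient(p, q))

def hellinger_distance(p, q):
    """D_H = sqrt(1 - BC(p, q)), one common normalization convention."""
    return math.sqrt(1.0 - bhattacharyya_coefficient(p, q))
```

For identical distributions BC = 1, so both distances vanish; they grow as the overlap between p and q shrinks.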

-Monte-Carlo simulation to compute the additive constants in Renyi entropy estimation: added; see 'estimate_HRenyi_constant.m'.

-Tsallis entropy is now available in ITE; it can be estimated via k-nearest neighbors, see 'HTsallis_kNN_k_initialization.m', 'HTsallis_kNN_k_estimation.m'.

+-Corrected a '/'->'*' typo in 'HRenyi_kNN_k_estimation.m'; see 'estimate_Ialpha.m' (V: volume of the unit ball).

-Schweizer-Wolff's sigma and kappa: added; see 'ISW1_initialization.m', 'ISW1_estimation.m', 'ISWinf_initialization.m', 'ISWinf_estimation.m'.
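The '/'->'*' fix above concerns the multiplicative constant of the Renyi kNN estimator, which involves V, the volume of the d-dimensional Euclidean unit ball. For reference, a minimal Python sketch of that quantity (illustrative only; the toolbox itself is MATLAB):

```python
import math

def unit_ball_volume(d):
    """Volume of the d-dimensional Euclidean unit ball:
    V = pi^(d/2) / Gamma(d/2 + 1)."""
    return math.pi ** (d / 2) / math.gamma(d / 2 + 1)
```

Sanity checks: V = 2 for d = 1 (the interval [-1, 1]), pi for d = 2 (the unit disk), and 4*pi/3 for d = 3.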

# code/H_I_D/base_estimators/HShannon_Edgeworth_estimation.m

+%Estimates the Shannon entropy (H) of Y (Y(:,t) is the t^th sample) using Edgeworth expansion. Cost parameters are provided in the cost object co.

+%We make use of the naming convention 'H<name>_estimation', to ease embedding new entropy estimation methods.

+%REFERENCE: Marc Van Hulle. Edgeworth approximation of multivariate differential entropy. Neural Computation, 17(9), 1903-1910, 2005.

+%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")

+%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by

+%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

+%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of

+%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

+%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.

+ H_whiten = log(prod(s));%we will take this scaling into account via the entropy transformation rule [ H(Wz) = H(z)+log(|det(W)|) ] at the end

+H_normal = log(det(cov(Y.')))/2 + d/2 * log(2*pi) + d/2;%Shannon entropy of a normal variable with cov(Y.') covariance.
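The two lines above combine the closed-form Shannon entropy of a d-dimensional normal, H = 0.5*log det(Sigma) + (d/2)*log(2*pi) + d/2, with the entropy transformation rule H(Wz) = H(z) + log|det(W)| used to undo the whitening. A small Python check of both identities (illustrative values, not toolbox code):

```python
import numpy as np

d = 3
Sigma = np.diag([1.0, 2.0, 0.5])  # example covariance, det(Sigma) = 1

# Shannon entropy (in nats) of a d-dimensional normal with covariance Sigma:
# H = 0.5*log det(Sigma) + (d/2)*log(2*pi) + d/2
H_normal = 0.5 * np.log(np.linalg.det(Sigma)) + 0.5 * d * np.log(2 * np.pi) + 0.5 * d

# Entropy transformation rule for a linear map: H(Wz) = H(z) + log|det(W)|.
# The entropy of N(0, W*Sigma*W') computed directly must match H_normal + log|det(W)|.
W = np.diag([2.0, 1.0, 1.0])
H_transformed = (0.5 * np.log(np.linalg.det(W @ Sigma @ W.T))
                 + 0.5 * d * np.log(2 * np.pi) + 0.5 * d)
```

Here det(W) = 2, so H_transformed exceeds H_normal by exactly log(2).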

# code/H_I_D/base_estimators/HShannon_Edgeworth_initialization.m

+%Initialization of the Edgeworth expansion based Shannon entropy estimator. The expansion is carried out around the normal distribution.

+% 2)We make use of the naming convention 'H<name>_initialization', to ease embedding new entropy estimation methods.

# code/H_I_D/utilities/Edgeworth_t1_t2_t3.m

+%Computes the three kappa_ijk := E[x_i x_j x_k] based terms (t1,t2,t3) in the Edgeworth expansion based entropy estimator, see 'HShannon_Edgeworth_estimation.m'.
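The kappa_ijk = E[x_i x_j x_k] quantities are third-order moment statistics of the (standardized) data. A hypothetical Python sketch of estimating them from a (d, T) sample matrix whose columns are observations, matching ITE's samples-in-columns convention (illustrative only; not the toolbox's MATLAB implementation):

```python
import numpy as np

def third_moments(X):
    """Estimate kappa_ijk = E[x_i x_j x_k] from samples.

    X : (d, T) array, columns are samples (assumed standardized).
    Returns a (d, d, d) tensor, symmetric in all three indices.
    """
    d, T = X.shape
    return np.einsum('it,jt,kt->ijk', X, X, X) / T
```

For d = 1 with samples [1, 2, 3], kappa_111 = (1 + 8 + 27)/3 = 12; the tensor's full symmetry follows directly from the symmetry of the product x_i x_j x_k.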

# code/IPA/optimization/clustering_UD0.m

% cost_name: depending on entropy/mutual information based ISA formulation [see below A)-F)], it can take the values:

-% a)base methods: 'Shannon_kNN_k', 'Renyi_kNN_k', 'Renyi_kNN_1tok', 'Renyi_kNN_S', 'Renyi_weightedkNN', 'Renyi_MST', 'Renyi_GSF'.

+% a)base methods: 'Shannon_kNN_k', 'Renyi_kNN_k', 'Renyi_kNN_1tok', 'Renyi_kNN_S', 'Renyi_weightedkNN', 'Renyi_MST', 'Renyi_GSF', 'Tsallis_kNN_k', 'Shannon_Edgeworth'.