 function [co] = IShannon_HShannon_initialization(mult)
%Initialization of the "meta" Shannon mutual information estimator, which is
% 1)based on any estimator of the Shannon differential entropy,
% 2)treated as a cost object (co).
%Mutual information is estimated using the relation:
% (*) I(y^1,...,y^M) = \sum_{m=1}^M H(y^m) - H([y^1,...,y^M]).
%
%Here, we make use of the naming convention: 'I<name>_initialization', to ease embedding new mutual information estimation methods.
%
%INPUT:
% mult: is the multiplicative constant relevant (needed) in the estimation? '=1': yes, '=0': no.
%OUTPUT:
% co: object (structure).
%
%Copyright (C) 2012 Zoltan Szabo ("http://nipg.inf.elte.hu/szzoli", "szzoli (at) cs (dot) elte (dot) hu")
%
%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
%
%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
%
%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
%
%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.
%mandatory fields:
co.name = 'Shannon_HShannon';
co.mult = mult; %field name matches the 'mult' convention used by the other ITE cost objects
%other fields:
co.member_name = 'Shannon_kNN_k'; %you can change it to any Shannon differential entropy estimator capable of
%'multiplicative factor correct' H estimation (mult=1) below. Note: the Renyi entropy (H_{R,alpha}) also converges to the Shannon entropy (H) in the limit: H_{R,alpha} -> H, as alpha -> 1.
co.member_co = H_initialization(co.member_name,1); %'1': since we use the relation '(*)', the member entropy estimator must be exact, i.e., multiplicative constants in the H estimation matter (mult=1).
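%Example (sketch, kept as comments so this file remains a valid function file):
%typical usage within the toolbox, assuming the companion estimator
%'IShannon_HShannon_estimation(Y,ds,co)' is on the path; the subspace
%dimensions in 'ds' and the sample size are illustrative values only:
%  ds = [2;3];                                 %dimensions of the M=2 subspaces y^1, y^2
%  co = IShannon_HShannon_initialization(1);   %mult=1: multiplicative constants matter
%  Y  = randn(sum(ds),5000);                   %sum(ds) x T sample matrix, one sample per column
%  I  = IShannon_HShannon_estimation(Y,ds,co); %estimated Shannon mutual information via (*)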
