%ITE / code / H_I_D_A_C / meta_estimators / DJensenShannon_HShannon_initialization.m

function [co] = DJensenShannon_HShannon_initialization(mult)
%Initialization of the Jensen-Shannon divergence estimator, defined according to the relation:
%D_JS(f_1,f_2) = H(w1*y^1+w2*y^2) - [w1*H(y^1) + w2*H(y^2)], where y^i has density f_i (i=1,2), w1*y^1+w2*y^2 denotes the mixture of y^1 and y^2 with positive weights w1, w2 (w1+w2=1), and H denotes the Shannon entropy.
%   1)The estimator is treated as a cost object (co).
%   2)We use the naming convention 'D<name>_initialization' to ease embedding new divergence estimation methods.
%   3)This is a meta method: the Shannon entropy estimator can be arbitrary.
%   mult: is the multiplicative constant relevant (needed) in the estimation? '=1' means yes, '=0' means no.
%   co: cost object (structure).
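%Example (illustrative usage sketch; assumes the ITE toolbox, including 'H_initialization' and the chosen Shannon entropy estimator, is on the Matlab/Octave path):
%   co = DJensenShannon_HShannon_initialization(1); %mult=1: multiplicative constants are relevant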
%Copyright (C) 2012 Zoltan Szabo ("", "szzoli (at) cs (dot) elte (dot) hu")
%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
%You should have received a copy of the GNU General Public License along with ITE. If not, see <http://www.gnu.org/licenses/>.

%mandatory fields:
    co.name = 'JensenShannon_HShannon';
    co.mult = mult;
%other fields:
    co.w = [1/2,1/2]; %assumption: co.w(i)>0, sum(co.w)=1
    co.member_name = 'Shannon_kNN_k'; %you can change it to any Shannon entropy estimator
    co.member_co = H_initialization(co.member_name,mult);