ITE / code / H_I_D / meta_estimators / DJdistance_initialization.m

function [co] = DJdistance_initialization(mult)
%Initialization of the symmetrised Kullback-Leibler divergence (also called J-distance) estimator, defined according to the relation:
%D_J(f_1,f_2) = D(f_1,f_2) + D(f_2,f_1), where D denotes the Kullback-Leibler divergence.
%   1)The estimator is treated as a cost object (co).
%   2)We make use of the naming convention 'D<name>_initialization', to ease embedding new divergence estimation methods.
%   mult: '=1' if multiplicative constants are relevant (needed) in the estimation, '=0' otherwise.
%   co: cost object (structure).
%Copyright (C) 2012 Zoltan Szabo ("", "szzoli (at) cs (dot) elte (dot) hu")
%This file is part of the ITE (Information Theoretical Estimators) Matlab/Octave toolbox.
%ITE is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
%the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
%This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
%MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
%You should have received a copy of the GNU General Public License along with ITE. If not, see <>.

%mandatory fields:
    co.name = 'Jdistance';
    co.mult = mult;
%other fields:
    co.member_name = 'Renyi_kNN_k'; %you can change it to any Kullback-Leibler divergence estimator; the Renyi divergence (D_{R,alpha}) converges to the Kullback-Leibler divergence (D): D_{R,alpha} -> D, as alpha -> 1.
    co.member_co = D_initialization(co.member_name,mult);
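As a side note, the defining relation D_J(f_1,f_2) = D(f_1,f_2) + D(f_2,f_1) from the header can be illustrated on discrete distributions. The sketch below is not ITE code (ITE estimates these quantities from samples, e.g. via k-nearest neighbours); it only demonstrates the symmetrisation, with the helper names `kl` and `j_distance` chosen here for illustration:

```python
import numpy as np

def kl(p, q):
    # Discrete Kullback-Leibler divergence D(p||q).
    # Assumes q > 0 wherever p > 0; terms with p = 0 contribute zero.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def j_distance(p, q):
    # Symmetrised KL, the J-distance: D_J(p,q) = D(p||q) + D(q||p).
    return kl(p, q) + kl(q, p)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(j_distance(p, q))  # positive, and symmetric in its arguments
```

Unlike D itself, D_J is symmetric by construction, which is the point of this meta-estimator: it reuses any KL divergence estimator (`co.member_co`) twice, with the argument order swapped.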