10 October 2017

function z = condEntropy(x, y)
% Compute conditional entropy z = H(x|y) of two discrete variables x and y.
% Input:
%   x, y: two integer vectors of the same length
% Output:
%   z: conditional entropy z = H(x|y)
% Written by Mo Chen ([email protected]).
assert(numel(x) == numel(y));
n = numel(x);
x = reshape(x,1,n);
y = reshape(y,1,n);
l = min(min(x),min(y));
x = x-l+1;                       % shift labels so the smallest is 1
y = y-l+1;
k = max(max(x),max(y));
idx = 1:n;
Mx = sparse(idx,x,1,n,k,n);      % indicator matrix: row i marks the value of x(i)
My = sparse(idx,y,1,n,k,n);
Pxy = nonzeros(Mx'*My/n);        % joint distribution of x and y
Hxy = -dot(Pxy,log2(Pxy));       % joint entropy H(x,y)
Py = nonzeros(mean(My,1));       % marginal distribution of y
Hy = -dot(Py,log2(Py));          % marginal entropy H(y)
% conditional entropy H(x|y) = H(x,y) - H(y)
z = Hxy-Hy;
z = max(0,z);
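For readers outside MATLAB, the same computation can be sketched in Python. This is a re-implementation of the formula H(x|y) = H(x,y) - H(y) for illustration, not Mo Chen's code: it counts joint and marginal frequencies with a hash map instead of sparse indicator matrices.

```python
from collections import Counter
from math import log2

def cond_entropy(x, y):
    """Conditional entropy H(x|y) of two equal-length discrete sequences.

    Mirrors the MATLAB code above: estimate the joint distribution P(x,y)
    and the marginal P(y) from counts, then return H(x,y) - H(y).
    """
    assert len(x) == len(y)
    n = len(x)
    # Joint distribution P(x, y) from co-occurrence counts.
    pxy = [c / n for c in Counter(zip(x, y)).values()]
    # Marginal distribution P(y).
    py = [c / n for c in Counter(y).values()]
    hxy = -sum(p * log2(p) for p in pxy)   # joint entropy H(x, y)
    hy = -sum(p * log2(p) for p in py)     # marginal entropy H(y)
    return max(0.0, hxy - hy)              # H(x|y) = H(x,y) - H(y)
```

For example, with x = [1, 1, 2, 2] and y = [1, 2, 1, 2] the variables are independent and uniform, so H(x|y) = H(x) = 1 bit.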

I found this code for computing conditional entropy and tried to run it on my data, but it does not work. For example, this line fails:

Mx = sparse(idx,x,1,n,k,n);

with my data:

x = -0.000001942084498 -0.000130445985932 -0.000703266812341 -0.000053579514111 -0.000064058620601 -0.000175743747962
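The likely cause: sparse(idx,x,1,n,k,n) requires x to contain positive integer indices, but the data above are continuous floats, so after the l = min(...) shift they are still not valid column indices. One common fix is to discretize each variable into integer bin labels before calling condEntropy. The sketch below (equal-width binning is an assumption; the right bin count and scheme depend on your data) shows the idea in Python:

```python
def discretize(values, n_bins=4):
    """Map continuous values to integer bin labels 1..n_bins (equal-width bins).

    The resulting integer labels can be fed to a discrete conditional-entropy
    routine such as the MATLAB condEntropy above. Equal-width binning is an
    illustrative choice, not part of the original code.
    """
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0   # guard against constant input
    labels = []
    for v in values:
        b = int((v - lo) / width) + 1   # bin index starting at 1
        labels.append(min(b, n_bins))   # clamp the maximum onto the top bin
    return labels

# The six sample values from the question:
x = [-0.000001942084498, -0.000130445985932, -0.000703266812341,
     -0.000053579514111, -0.000064058620601, -0.000175743747962]
labels = discretize(x, n_bins=3)
```

Discretizing both x and y this way (same scheme for each variable) yields integer vectors that condEntropy accepts; note the entropy estimate will depend on the number of bins chosen.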
