02 February 2014

The conditional entropy definition says:

Let X* = {X1, X2, ..., Xn} and Y* = {Y1, ..., Ym}. According to information theory, the normalized conditional entropy H(Y*|X*) is defined by:

H(Y*|X*) = Σ_{i=1}^{n} P(Xi) H(Y*|Xi) / log m

where

H(Y*|Xi) = - Σ_{j=1}^{m} P(Yj|Xi) log P(Yj|Xi)
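The two formulas above can be sketched directly in Python, assuming the probabilities P(Xi) and P(Yj|Xi) are estimated from co-occurrence counts of (x, y) pairs (the function and variable names here are mine, not from the question):

```python
import math
from collections import Counter

def conditional_entropy(pairs):
    """Normalized conditional entropy H(Y*|X*) from a list of (x, y) pairs.

    H(Y*|X*) = sum_i P(Xi) * H(Y*|Xi) / log m, where m = |Y*| and
    H(Y*|Xi) = -sum_j P(Yj|Xi) * log P(Yj|Xi).
    """
    n = len(pairs)
    x_counts = Counter(x for x, _ in pairs)   # counts of each Xi
    xy_counts = Counter(pairs)                # counts of each (Xi, Yj) pair
    m = len({y for _, y in pairs})            # number of distinct Y values
    h = 0.0
    for x, cx in x_counts.items():
        px = cx / n                           # P(Xi)
        hx = 0.0                              # H(Y*|Xi)
        for (x2, y), cxy in xy_counts.items():
            if x2 == x:
                p = cxy / cx                  # P(Yj|Xi)
                hx -= p * math.log(p)
        h += px * hx
    return h / math.log(m)                    # normalize by log m (needs m >= 2)
```

When Y is fully determined by X the result is 0, and when Y is uniform and independent of X it is 1, which is the point of the log m normalization.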

According to Table 2 (in the attachment), I'd like to compute the conditional entropy of D = {Max-Speed, Acceleration} given C = {Size, Engine, Colour}, so I must compute H(D*|C*).

Can anyone tell me how to apply the above rules to solve this problem?
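I don't have Table 2, so the rows below are made up purely for illustration. The idea is to treat each combination of the C attributes (Size, Engine, Colour) as one value Xi and each combination of the D attributes (Max-Speed, Acceleration) as one value Yj, then apply the two formulas from the definition:

```python
import math
from collections import Counter

# Hypothetical rows standing in for Table 2 (the real values are in the attachment).
rows = [
    # (Size,     Engine,   Colour),  (Max-Speed, Acceleration)
    (("small",  "petrol", "red"),  ("high", "fast")),
    (("small",  "petrol", "red"),  ("high", "fast")),
    (("small",  "petrol", "red"),  ("high", "slow")),
    (("large",  "diesel", "blue"), ("low",  "slow")),
    (("large",  "diesel", "red"),  ("low",  "fast")),
    (("medium", "petrol", "blue"), ("high", "slow")),
]

n = len(rows)
c_counts = Counter(c for c, _ in rows)         # counts of each C combination
cd_counts = Counter(rows)                      # counts of each (C, D) combination
m = len({d for _, d in rows})                  # distinct D combinations

h = 0.0
for c, cc in c_counts.items():
    p_c = cc / n                               # P(Ci)
    h_c = -sum((cnt / cc) * math.log(cnt / cc) # H(D*|Ci)
               for (c2, _), cnt in cd_counts.items() if c2 == c)
    h += p_c * h_c

print("H(D*|C*) =", h / math.log(m))           # normalized by log m
```

Only the first C combination is ambiguous here (two of its three rows give ("high", "fast"), one gives ("high", "slow")), so only it contributes a nonzero H(D*|Ci); the other groups determine D exactly and contribute 0.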

Thanks a lot in advance.
