I am trying to learn how to calculate information gain and have hit a brick wall. The definition I am working from is:

Gain(Y, X) = entropy(Y) - entropy(Y|X)
The first term, entropy(Y), is easy. But entropy(Y|X) is where I am stuck...
As far as I understand, entropy(Y|X) = SUM (P(X = x) * entropy(Y|X = x)), summed over all values x of X (not Y).
But what is entropy(Y|X = x), and how do I actually compute it from data?
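For concreteness, here is a minimal Python sketch of how I currently understand the whole calculation (the function names and the toy weather/play data below are just my own illustration, assuming Y and X are lists of discrete values):

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (in bits) of a list of discrete values."""
    total = len(values)
    return -sum((count / total) * math.log2(count / total)
                for count in Counter(values).values())

def conditional_entropy(y, x):
    """entropy(Y|X): weight entropy(Y|X = x) by P(X = x) for each value x."""
    total = len(x)
    result = 0.0
    for x_value in set(x):
        # restrict Y to the rows where X takes this value
        y_subset = [yi for yi, xi in zip(y, x) if xi == x_value]
        result += (len(y_subset) / total) * entropy(y_subset)
    return result

def gain(y, x):
    """Information gain: entropy(Y) - entropy(Y|X)."""
    return entropy(y) - conditional_entropy(y, x)

# Toy example: does the weather (X) tell us whether we play (Y)?
x = ["sunny", "sunny", "rain", "rain", "overcast"]
y = ["no",    "no",    "yes",  "yes",  "yes"]
print(entropy(y))                 # ~0.971 bits
print(conditional_entropy(y, x))  # 0.0 here: each weather value pins down Y
print(gain(y, x))                 # ~0.971 bits
```

Is this roughly right, or am I misunderstanding the entropy(Y|X = x) term?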