
finding specific conditional entropy (for calculating information gain)

Status
Not open for further replies.

ssjalakazam

Programmer
Apr 15, 2008
1
US
I am trying to learn how to calculate information gain, and have hit a brick wall. Gain(Y,X) = entropy(Y) - entropy(Y|X)

The first term, entropy(Y), is easy.

But, the entropy(Y|X) is the problem...


so entropy(Y|X) = SUM (prob[x] * entropy(Y|X = x)), over all values of X.

But what is entropy(Y|X=x)? How do you find it? I have seen nothing online that explains this, and it seems crucial to calculating information gain.
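entropy(Y|X = x) is just the ordinary Shannon entropy of Y, computed only over the rows of the dataset where X takes the value x. In other words: filter the data to X == x, then apply the usual entropy formula to the Y labels in that subset, and weight each subset's entropy by P(X = x). A minimal sketch in Python (the toy outlook/play data below is hypothetical, made up just to illustrate):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of labels."""
    total = len(labels)
    return -sum((c / total) * log2(c / total)
                for c in Counter(labels).values())

def conditional_entropy(Y, X):
    """H(Y|X) = sum over x of P(X=x) * H(Y | X=x)."""
    total = len(X)
    h = 0.0
    for x in set(X):
        # H(Y | X = x): entropy of Y restricted to rows where X == x
        y_given_x = [y for y, xv in zip(Y, X) if xv == x]
        h += (len(y_given_x) / total) * entropy(y_given_x)
    return h

def information_gain(Y, X):
    """Gain(Y,X) = entropy(Y) - entropy(Y|X)."""
    return entropy(Y) - conditional_entropy(Y, X)

# Hypothetical toy data: Y = play tennis, X = outlook
X = ["sunny", "sunny", "rain", "rain", "rain", "overcast"]
Y = ["no",    "no",    "yes",  "yes",  "no",   "yes"]
print(information_gain(Y, X))
```

Here entropy(Y) = 1.0 (three yes, three no), the sunny and overcast subsets are pure (entropy 0), and only the rain subset contributes, so the gain works out to roughly 0.54 bits.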
 