
Cross-entropy

Information-theoretic measure

In information theory, the cross-entropy between two probability distributions p and q, defined over the same underlying set of events, measures the average number of bits needed to identify an event drawn from the set when the coding scheme used for the set is optimized for an estimated probability distribution q rather than the true distribution p.
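For two discrete distributions p and q over a common support X, the standard definition can be written as follows (a minimal sketch, using a base-2 logarithm so the result is measured in bits):

    H(p, q) = -\sum_{x \in \mathcal{X}} p(x) \log_2 q(x)

Using the natural logarithm instead gives the same quantity measured in nats rather than bits.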


