Differential entropy

Differential entropy is a concept in information theory that began as an attempt by Claude Shannon to extend the idea of entropy, a measure of the average information content of a random variable, to continuous probability distributions.
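For a continuous random variable with density f, the differential entropy is h(X) = -∫ f(x) ln f(x) dx. As a minimal sketch (not from the source), the snippet below approximates this integral numerically for a standard Gaussian and compares it against the known closed form 0.5·ln(2πe); the function names and grid parameters are illustrative choices.

```python
import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    """Density of a normal distribution N(mu, sigma^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def differential_entropy(pdf, lo, hi, n=100_000):
    """Midpoint-rule approximation of h(X) = -integral of f(x) ln f(x) dx."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        f = pdf(x)
        if f > 0.0:  # f ln f -> 0 as f -> 0, so zero-density points contribute nothing
            total -= f * math.log(f) * dx
    return total

# Numerical estimate over a range wide enough to capture nearly all the mass
h_numeric = differential_entropy(gaussian_pdf, -10.0, 10.0)

# Closed form for N(0, 1): 0.5 * ln(2 * pi * e) nats
h_exact = 0.5 * math.log(2 * math.pi * math.e)
```

Note that, unlike discrete entropy, differential entropy can be negative (e.g. for a narrow uniform distribution), since f(x) can exceed 1.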