Entropy
A measure of how spread out a probability distribution is. H = -Σ p log p, where the sum runs over outcomes with probability p (the log base sets the units; base 2 gives bits). Low entropy = concentrated on a few outcomes; high entropy = nearly uniform.
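The formula can be sketched directly; this is a minimal illustration (the `entropy` helper and the example distributions are my own, not from the source), skipping zero-probability terms since p log p → 0 as p → 0:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log p), skipping zero-probability terms."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Concentrated on one outcome: low entropy
print(entropy([0.97, 0.01, 0.01, 0.01]))  # ≈ 0.24 bits

# Uniform over 4 outcomes: maximal entropy, log2(4) = 2 bits
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
```

A uniform distribution over n outcomes always attains the maximum, log₂ n bits.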