Let a discrete distribution have probability function $p_k$, and let a second discrete distribution have probability function $q_k$. Then the relative entropy of $p$ with respect to $q$, also called the Kullback-Leibler distance, is defined by

$$d(p,q) = \sum_k p_k \log_2\!\left(\frac{p_k}{q_k}\right).$$
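As a concrete illustration, here is a minimal Python sketch of this definition. The function name `relative_entropy` and the example distributions are hypothetical, chosen only to show the computation (base-2 logarithm, with the usual convention that terms with $p_k = 0$ contribute zero).

```python
import math

def relative_entropy(p, q):
    """Relative entropy d(p, q) = sum_k p_k * log2(p_k / q_k).

    p and q are sequences of probabilities over the same outcomes.
    Terms with p_k = 0 contribute 0, by the convention 0 * log 0 = 0.
    """
    return sum(pk * math.log2(pk / qk) for pk, qk in zip(p, q) if pk > 0)

# Two hypothetical distributions over three outcomes:
p = [0.5, 0.25, 0.25]
q = [0.25, 0.5, 0.25]

print(relative_entropy(p, q))  # 0.5*log2(2) + 0.25*log2(1/2) + 0 = 0.25
```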
Although $d(p,q) \ne d(q,p)$ in general, so relative entropy is not a true metric, it satisfies many important mathematical properties. For example, it is a convex function of $p_k$, is always nonnegative, and equals zero only if $p_k = q_k$ for all $k$.
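A quick numerical check illustrates these properties on hypothetical two-outcome distributions: swapping the arguments changes the value, the result stays nonnegative, and it vanishes when the distributions coincide. The function is redefined here so the snippet runs standalone.

```python
import math

def relative_entropy(p, q):
    # d(p, q) = sum_k p_k * log2(p_k / q_k), with 0 * log 0 = 0
    return sum(pk * math.log2(pk / qk) for pk, qk in zip(p, q) if pk > 0)

p = [0.5, 0.5]
q = [0.75, 0.25]

print(relative_entropy(p, q))  # ~0.2075, nonnegative
print(relative_entropy(q, p))  # ~0.1887, so d(p,q) != d(q,p): not symmetric
print(relative_entropy(p, p))  # 0.0, zero exactly when p_k = q_k for all k
```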
Relative entropy is a very important concept in quantum information theory, as well as statistical mechanics (Qian 2000).