One-dimensional Kullback-Leibler divergence of two independent data groups to measure class separability

`relativeEntropy` is a function used in code generated by **Diagnostic Feature Designer**.

`Z = relativeEntropy(X,I)` calculates the one-dimensional Kullback-Leibler divergence of the two independent subsets of data set `X` that are grouped according to the logical labels in `I`. The relative entropy provides a metric for ranking features according to their ability to separate two classes of data, such as data from healthy and faulty machines. The entropy calculation assumes that the data in `X` follows a Gaussian distribution.
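Under the Gaussian assumption, the Kullback-Leibler divergence between the two labeled subsets has a closed form in terms of their sample means and variances. The following Python sketch illustrates the idea; the function name, the use of the one-directional KL divergence, and the sample-variance convention are assumptions for illustration, not MATLAB's documented implementation.

```python
import numpy as np

def relative_entropy(x, labels):
    """Sketch of a 1-D KL-divergence class-separability score.

    Fits a Gaussian to each labeled subset of x and returns the
    closed-form KL divergence KL(N(mu1, v1) || N(mu2, v2)).
    A larger score suggests the feature separates the two classes better.
    """
    x = np.asarray(x, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    a, b = x[labels], x[~labels]          # split data by logical label
    mu1, mu2 = a.mean(), b.mean()         # sample means of each class
    v1, v2 = a.var(ddof=1), b.var(ddof=1) # unbiased sample variances
    # Closed-form KL divergence between two univariate Gaussians
    return 0.5 * (np.log(v2 / v1) + (v1 + (mu1 - mu2) ** 2) / v2 - 1.0)
```

For two subsets drawn from the same distribution the score is near zero, while widely separated class means drive it up, which is what makes it usable as a feature-ranking metric.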

Code generated by **Diagnostic Feature Designer** uses `relativeEntropy` when ranking features with this method.
