This function calculates the symmetrised Kullback-Leibler divergence (KL divergence) between each pair of classes. Designed for KLFDA.

KL_divergence(obj)

Arguments

obj

The KLFDA object. Users can modify it to suit their own purposes.

Details

Value

Returns the symmetrised KL divergence between each pair of classes.
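For illustration only (this is not the package's R implementation), a minimal Python sketch of the symmetrised KL divergence between two discrete distributions, D(P, Q) = KL(P||Q) + KL(Q||P):

```python
import numpy as np

def kl(p, q):
    # KL(P || Q) for discrete distributions.
    # Assumes q > 0 wherever p > 0; terms with p == 0 contribute nothing.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def symmetrised_kl(p, q):
    # Symmetrising removes the asymmetry of plain KL divergence,
    # so the result is the same whichever class is taken as P.
    return kl(p, q) + kl(q, p)

p = [0.4, 0.6]
q = [0.5, 0.5]
print(symmetrised_kl(p, q))  # same value as symmetrised_kl(q, p)
```

In KLFDA this quantity would be computed for every pair of class distributions, yielding a symmetric pairwise divergence matrix.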

References

Van Erven, T., & Harremos, P. (2014). Renyi divergence and Kullback-Leibler divergence. IEEE Transactions on Information Theory, 60(7), 3797-3820.

Pierre Enel (2019). Kernel Fisher Discriminant Analysis (https://www.github.com/p-enel/MatlabKFDA), GitHub.

Author

qinxinghu@gmail.com

Note

This function is useful for estimating the loss between the reduced features and the original features. The same quantity is used in t-SNE to assess its projection performance.