KL_divergence.Rd
This function calculates the symmetrised Kullback-Leibler divergence (KL divergence) between each pair of classes. Designed for KLFDA.
KL_divergence(obj)
Argument | Description
---|---
obj | The KLFDA object. Users can modify it to suit their own purposes.
Returns a symmetrised version of the KL divergence between each pair of classes.
Van Erven, T., & Harremoës, P. (2014). Rényi divergence and Kullback-Leibler divergence. IEEE Transactions on Information Theory, 60(7), 3797-3820.
Pierre Enel (2019). Kernel Fisher Discriminant Analysis (https://www.github.com/p-enel/MatlabKFDA), GitHub.
qinxinghu@gmail.com
This function is useful for estimating the loss between the reduced features and the original features. The same measure has been adopted in t-SNE to assess projection performance.
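The documentation does not specify how the per-class densities are modelled, so the sketch below is only a minimal illustration: it computes a symmetrised KL divergence between two classes under the assumption of Gaussian class-conditional densities fitted to the (reduced) feature matrices. The helper names (kl_gauss, sym_kl) and the simulated data are hypothetical and are not part of the package implementation.

  ## Illustrative sketch only: symmetrised KL divergence between two classes,
  ## assuming Gaussian class-conditional densities.
  kl_gauss <- function(mu1, S1, mu2, S2) {
    # KL( N(mu1, S1) || N(mu2, S2) ) for multivariate Gaussians
    d <- length(mu1)
    S2inv <- solve(S2)
    delta <- mu2 - mu1
    0.5 * (sum(diag(S2inv %*% S1)) +
             drop(t(delta) %*% S2inv %*% delta) - d +
             as.numeric(determinant(S2, logarithm = TRUE)$modulus) -
             as.numeric(determinant(S1, logarithm = TRUE)$modulus))
  }

  sym_kl <- function(X1, X2) {
    # Fit a Gaussian to each class and return KL(P||Q) + KL(Q||P)
    mu1 <- colMeans(X1); S1 <- cov(X1)
    mu2 <- colMeans(X2); S2 <- cov(X2)
    kl_gauss(mu1, S1, mu2, S2) + kl_gauss(mu2, S2, mu1, S1)
  }

  ## Simulated two-dimensional reduced features for two classes
  set.seed(1)
  X1 <- matrix(rnorm(100, mean = 0), ncol = 2)
  X2 <- matrix(rnorm(100, mean = 1), ncol = 2)
  sym_kl(X1, X2)

A larger value indicates better separation between the two classes in the reduced space; summing or averaging this quantity over all class pairs gives one way to compare projections.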