JS Divergence (Jensen-Shannon Divergence)

JS divergence measures the similarity between two probability distributions. It is a variant of KL divergence that addresses KL divergence's asymmetry: JS divergence is symmetric, and its value lies between 0 and 1 (when the logarithm is taken in base 2).

The definition is as follows:
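For distributions $P$ and $Q$, with $M$ denoting their average, the standard form is:

```latex
\mathrm{JS}(P \,\|\, Q) = \frac{1}{2}\,\mathrm{KL}(P \,\|\, M) + \frac{1}{2}\,\mathrm{KL}(Q \,\|\, M),
\qquad M = \frac{1}{2}(P + Q)
```

Because $M$ is the mixture of the two distributions, swapping $P$ and $Q$ leaves the value unchanged, which is where the symmetry comes from.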

However, there is a problem common to both KL divergence and JS divergence:

If the two distributions P and Q are very far apart and have no overlap at all, the KL divergence is undefined (or infinite), while the JS divergence takes the constant value log 2 regardless of how far apart the distributions are. A constant value means the gradient at that point is 0, so it provides no learning signal.
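A minimal NumPy sketch of this behavior (the helper names `kl_divergence` and `js_divergence` are illustrative, not from any particular library): for two distributions with completely disjoint supports, the JS divergence comes out to the constant log 2, while identical distributions give 0.

```python
import numpy as np

def kl_divergence(p, q):
    # KL(P || Q) = sum_i p_i * log(p_i / q_i); terms with p_i == 0 contribute 0
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    # JS(P || Q) = 0.5 * KL(P || M) + 0.5 * KL(Q || M), with M = (P + Q) / 2.
    # M is positive wherever P or Q is, so the inner KL terms are always finite.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = np.array([0.5, 0.5, 0.0, 0.0])
q = np.array([0.0, 0.0, 0.5, 0.5])   # no overlap with p at all

print(js_divergence(p, q))   # log(2) ~ 0.6931: constant for any disjoint pair
print(js_divergence(p, p))   # 0.0 for identical distributions
print(js_divergence(p, q) == js_divergence(q, p))   # symmetric: True
```

Shifting `q` even further away would leave the value pinned at log 2, which is exactly the zero-gradient problem described above (and the motivation for distances like Wasserstein in GAN training).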