# Statistical Divergence
![[Pasted image 20240618005023.png|400]]
Divergence is a measure of how one probability distribution differs from another.
## Kullback-Leibler (KL) Divergence
![[kldiv_viz.gif|550]]
KL divergence quantifies how much one probability distribution differs from a second, reference distribution.
It is also called *relative entropy*.
For discrete distributions $p$ and $q$ over the same outcomes $x_1,\dots,x_N$, it is defined as:
$D_{KL}(p||q)=\sum^{N}_{i=1}p(x_i)\cdot \log\frac{p(x_i)}{q(x_i)}$
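As a concrete illustration, here is a minimal NumPy sketch of the discrete formula above (the function name `kl_divergence` and the example distributions are made up for this note, not from a specific library):

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(p || q), using the natural log (nats)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum only over outcomes where p(x) > 0, using the convention 0 * log(0/q) = 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Example: two distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # small positive value; equals 0 only when p == q
print(kl_divergence(q, p))  # generally different: KL divergence is not symmetric
```

The second print highlights that $D_{KL}(p||q) \neq D_{KL}(q||p)$ in general, which is why KL divergence is not a true distance metric.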