Cross-Entropy

References

Tags: concept

Sources:

Updates:

April 20th, 2021: created note.

Notes

Summary:

Key points:

Cross-Entropy measures how similar two distributions $p_\theta$ and $p$ are.

$$H(p, p_\theta) = -\sum_{y} p(y \mid x_i) \log p_\theta(y \mid x_i)$$
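
As a concrete check of the definition, a minimal NumPy sketch; the two distributions over four classes are made-up numbers for illustration, not from any particular model:

```python
import numpy as np

# Illustrative distributions over 4 classes for a fixed input x_i
# (the numbers are made up for the example).
p = np.array([0.1, 0.6, 0.2, 0.1])        # true distribution p(y | x_i)
p_theta = np.array([0.2, 0.5, 0.2, 0.1])  # model distribution p_theta(y | x_i)

# H(p, p_theta) = -sum_y p(y | x_i) * log p_theta(y | x_i)
print(-np.sum(p * np.log(p_theta)))  # ≈ 1.129 nats
```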

If we assume $y_i \sim p(y \mid x_i)$, meaning the label is sampled from the true distribution, then $H(p, p_\theta) \approx -\log p_\theta(y_i \mid x_i)$: the sum above is an expectation over $y \sim p(y \mid x_i)$, and $-\log p_\theta(y_i \mid x_i)$ is a single-sample Monte Carlo estimate of that expectation.
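
A small self-contained sketch checking this numerically, using the same made-up distributions as above; averaging the single-sample estimates over many draws recovers the exact cross-entropy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Same illustrative distributions as above.
p = np.array([0.1, 0.6, 0.2, 0.1])        # true p(y | x_i)
p_theta = np.array([0.2, 0.5, 0.2, 0.1])  # model p_theta(y | x_i)

# Draw labels y_i ~ p(y | x_i); each -log p_theta(y_i | x_i) is a
# single-sample estimate of H(p, p_theta), so their mean converges to it.
samples = rng.choice(len(p), size=100_000, p=p)
print(-np.log(p_theta[samples]).mean())  # ≈ 1.129, matching the exact sum
```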
