
Cross-Entropy

References

Tags: concept

Sources:

Updates:

April 20th, 2021: created note.

Notes

Summary:

Key points:

Cross-entropy measures how well a model distribution $p_\theta$ approximates a true distribution $p$:

$$H(p, p_\theta) = -\sum_{y} p(y \mid x_i) \log p_\theta(y \mid x_i)$$
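
A minimal NumPy sketch of this formula for a single example $x_i$ with three classes; the distributions `p` and `p_theta` are assumed illustrative values, not from the note:

```python
import numpy as np

def cross_entropy(p: np.ndarray, p_theta: np.ndarray, eps: float = 1e-12) -> float:
    """H(p, p_theta) = -sum_y p(y | x_i) * log p_theta(y | x_i)."""
    # eps guards against log(0) when the model assigns zero probability
    return float(-np.sum(p * np.log(p_theta + eps)))

p       = np.array([0.0, 1.0, 0.0])  # true label distribution p(y | x_i)
p_theta = np.array([0.1, 0.7, 0.2])  # model's predicted distribution

print(cross_entropy(p, p_theta))     # ~0.357, i.e. -log(0.7)
```

When $p$ is a one-hot label, as above, the sum collapses to the negative log-probability the model assigns to the correct class.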

If we assume $y_i \sim p(y \mid x_i)$, meaning the observed label is a sample from the true distribution, then the sum over $y$ can be replaced by a single-sample Monte Carlo estimate: $H(p, p_\theta) \approx -\log p_\theta(y_i \mid x_i)$.
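
A quick numerical check of this approximation (the distributions and sample count are assumed for illustration): averaging $-\log p_\theta(y_i \mid x_i)$ over labels sampled from $p$ converges to the exact cross-entropy.

```python
import numpy as np

rng = np.random.default_rng(0)

p       = np.array([0.2, 0.5, 0.3])  # true distribution p(y | x_i)
p_theta = np.array([0.3, 0.4, 0.3])  # model distribution p_theta(y | x_i)

# Exact cross-entropy: full sum over all labels y
exact = -np.sum(p * np.log(p_theta))

# Monte Carlo estimate: sample y_i ~ p, average -log p_theta(y_i | x_i)
labels = rng.choice(3, size=100_000, p=p)
monte_carlo = -np.log(p_theta[labels]).mean()

print(f"exact={exact:.4f}  monte_carlo={monte_carlo:.4f}")  # values agree closely
```

This is why minimizing the average negative log-likelihood of observed labels during training amounts to minimizing the cross-entropy between the data distribution and the model.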
