Machine Learning Garden


Loss Function

References

Tags: concept

Sources:

Updates:

Notes

Summary:

Key points:

The Loss Function quantifies how bad the parameters $\theta$ are.

We want the least bad $\theta$.

Negative Log-Likelihood (NLL) is sometimes also called Cross-Entropy.

Examples

Negative log-likelihood: $-\sum_{i} \log p_{\theta}\left(y_{i} \mid x_{i}\right)$
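A minimal sketch of the NLL in NumPy. The probabilities below are hypothetical model outputs $p_\theta(y_i \mid x_i)$ for the true label of each example, invented for illustration:

```python
import numpy as np

# Hypothetical probabilities the model assigns to each example's true label
p_true = np.array([0.9, 0.6, 0.8])

# NLL / cross-entropy: -sum_i log p_theta(y_i | x_i)
nll = -np.sum(np.log(p_true))
```

Higher probability on the correct labels drives the loss toward zero; confident wrong predictions blow it up, since $\log p \to -\infty$ as $p \to 0$.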

Zero-one loss: $-\sum_{i} \delta\left(f_{\theta}\left(x_{i}\right)=y_{i}\right)$
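This counts (negated) how many predictions match their labels, so minimizing it maximizes accuracy. A sketch with made-up predictions and labels:

```python
import numpy as np

preds = np.array([1, 0, 2, 1])   # f_theta(x_i), hypothetical class predictions
labels = np.array([1, 0, 1, 1])  # y_i, hypothetical true labels

# -sum_i delta(f_theta(x_i) = y_i): delta is 1 on a match, 0 otherwise
loss = -np.sum(preds == labels)
```

Because the loss is piecewise constant in $\theta$, it is not directly usable with gradient descent; the NLL above serves as a differentiable surrogate.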

Squared error: $\sum_{i} \frac{1}{2}\left|f_{\theta}\left(x_{i}\right)-y_{i}\right|^{2}$
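The squared-error loss, typical for regression, can be sketched the same way; the values are again hypothetical:

```python
import numpy as np

preds = np.array([1.0, 2.0, 0.5])    # f_theta(x_i), hypothetical outputs
targets = np.array([1.5, 2.0, 0.0])  # y_i, hypothetical regression targets

# sum_i (1/2) |f_theta(x_i) - y_i|^2
loss = np.sum(0.5 * (preds - targets) ** 2)
```

The factor $\frac{1}{2}$ is a convention: it cancels the 2 that appears when differentiating the square, giving a gradient of simply $f_\theta(x_i) - y_i$.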
