References
Notes {{word-count}}
Summary:
Key points:
The loss function quantifies how bad a model's predictions are.
We want the model that is least bad, i.e., the one with the smallest loss.
Negative Log-Likelihood (NLL) is sometimes also called Cross-Entropy.
The loss function lets us compare models within a model class: one model is better than another if it achieves a lower loss.
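A minimal sketch of the NLL/cross-entropy connection noted above: for a one-hot target, the cross-entropy between the target and the predicted probabilities reduces to the negative log-probability of the true class, which is exactly the NLL. The probability values and class labels below are made-up illustrations.

```python
import math

def nll(probs, label):
    """Negative log-likelihood: -log of the probability assigned to the true label."""
    return -math.log(probs[label])

def cross_entropy(probs, target):
    """Cross-entropy between a target distribution and predicted probabilities."""
    return -sum(t * math.log(p) for t, p in zip(target, probs) if t > 0)

# Hypothetical predicted distribution over 3 classes; true class is index 1.
probs = [0.1, 0.7, 0.2]
one_hot = [0.0, 1.0, 0.0]

# With a one-hot target, cross-entropy and NLL coincide.
assert abs(nll(probs, 1) - cross_entropy(probs, one_hot)) < 1e-12
```

A lower value means the model assigned more probability to the correct class, which is the sense in which one model is "less bad" than another.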