# Loss

## Binary Cross Entropy

\[L(y, \hat{y}) = -\left[ y \log (\hat{y}) + (1-y) \log (1 - \hat{y}) \right]\]
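A minimal sketch of the formula above in plain Python (the function name, the `eps` clipping constant, and the example values are my own choices, not from the source):

```python
import math

def binary_cross_entropy(y, y_hat, eps=1e-12):
    """BCE for a single example: y is the true label (0 or 1),
    y_hat is the predicted probability of class 1."""
    # Clip y_hat away from 0 and 1 so log() never sees 0
    y_hat = min(max(y_hat, eps), 1 - eps)
    return -(y * math.log(y_hat) + (1 - y) * math.log(1 - y_hat))

print(binary_cross_entropy(1, 0.9))  # ≈ 0.105, low loss for a confident correct prediction
print(binary_cross_entropy(1, 0.1))  # ≈ 2.303, high loss for a confident wrong prediction
```

Note how the two terms are mutually exclusive: when \(y = 1\) only the first term survives, and when \(y = 0\) only the second does.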

## Multiclass Cross Entropy, a.k.a. negative log-likelihood loss

\[L(y, \hat{y}) = -\log(\hat{y}_c)\]

where \(c\) is the index of the correct class and \(\hat{y}\) is the predicted probability vector over classes.
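The same idea as a sketch, assuming `y_hat` is already a normalized probability vector (the function name and example values are illustrative):

```python
import math

def nll_loss(y_hat, c):
    """Negative log-likelihood for one example: y_hat is a list of
    class probabilities summing to 1, c is the correct class index."""
    # Only the probability assigned to the correct class matters
    return -math.log(y_hat[c])

print(nll_loss([0.1, 0.7, 0.2], 1))  # ≈ 0.357
print(nll_loss([0.1, 0.7, 0.2], 0))  # ≈ 2.303, heavily penalizes a low probability on the true class
```

For two classes with \(\hat{y} = [1 - p, p]\), this reduces to the binary cross entropy above.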

## Exercises

  • Derive them (e.g. as the negative log-likelihood of a Bernoulli / categorical model).