Loss
====

Binary Cross Entropy
--------------------

.. math::

   L(y, \hat{y}) = -\left( y \log(\hat{y}) + (1-y) \log(1 - \hat{y}) \right)

Multiclass Cross Entropy AKA negative log likelihood loss
----------------------------------------------------------

.. math::

   L(y, \hat{y}) = -\log(\hat{y}_c)

where :math:`c` is the index of the correct class.

Exercises
---------

* Derive the gradients of both losses with respect to :math:`\hat{y}` (a NumPy sketch of the forward computations follows below).
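For reference while deriving, here is a minimal NumPy sketch of both forward computations, assuming ``y_hat`` already holds probabilities (e.g. sigmoid or softmax outputs). The function names, the ``eps`` clipping, and the mean reduction over samples are illustrative assumptions, not part of the formulas above.

.. code-block:: python

    import numpy as np

    def binary_cross_entropy(y, y_hat, eps=1e-12):
        """BCE averaged over samples; y in {0, 1}, y_hat in (0, 1)."""
        y_hat = np.clip(y_hat, eps, 1 - eps)  # avoid log(0)
        return -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

    def negative_log_likelihood(y_hat, c, eps=1e-12):
        """Multiclass cross entropy: -log(probability of the correct class).

        y_hat: (n_samples, n_classes) rows of class probabilities
        c:     (n_samples,) integer index of the correct class per sample
        """
        correct = np.clip(y_hat[np.arange(len(c)), c], eps, 1.0)
        return -np.mean(np.log(correct))

    # Quick checks
    y = np.array([1, 0, 1])
    y_hat = np.array([0.9, 0.2, 0.6])
    print(binary_cross_entropy(y, y_hat))  # ~0.280

    probs = np.array([[0.7, 0.2, 0.1],
                      [0.1, 0.8, 0.1]])
    print(negative_log_likelihood(probs, np.array([0, 1])))  # ~0.290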