ML/DL Notes 0.0.1 documentation
Contents
Binary Cross Entropy
Multiclass Cross Entropy AKA negative log likelihood loss
Exercises
Loss
Binary Cross Entropy
\[L(y, \hat{y}) = -\left[ y \log(\hat{y}) + (1-y) \log(1 - \hat{y}) \right]\]
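A minimal NumPy sketch (not from these notes) of binary cross entropy for a single prediction; the `eps` clipping is an assumed numerical guard against `log(0)`:

```python
import numpy as np

def binary_cross_entropy(y, y_hat, eps=1e-12):
    # Clip the prediction away from 0 and 1 so log() stays finite.
    y_hat = np.clip(y_hat, eps, 1 - eps)
    # Negative log-likelihood of a Bernoulli with parameter y_hat.
    return -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

# A confident, correct prediction gives a small loss: -log(0.9) ≈ 0.105
loss = binary_cross_entropy(1.0, 0.9)
```

Note that the loss goes to infinity as a confidently wrong prediction approaches 0 or 1, which is why the clipping matters in practice.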
Multiclass Cross Entropy AKA negative log likelihood loss
\[L = -\log(\hat{y}_c)\]
where c is the index of the correct class.
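A short sketch (again assumed, not from the notes) of the multiclass case: given a predicted probability vector, the loss is just the negative log of the probability assigned to the correct class:

```python
import numpy as np

def nll_loss(probs, c):
    # probs: predicted class probabilities (e.g. a softmax output), summing to 1
    # c: index of the correct class
    return -np.log(probs[c])

probs = np.array([0.1, 0.7, 0.2])
loss = nll_loss(probs, 1)  # -log(0.7), small because the model favors class 1
```

With a one-hot target, the full cross-entropy sum collapses to this single term, which is why the two names are used interchangeably.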
Exercises
Derive ‘em
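A hint for the binary case, assuming the maximum-likelihood framing: model the label \(y \in \{0, 1\}\) as a Bernoulli random variable with parameter \(\hat{y}\),
\[p(y \mid \hat{y}) = \hat{y}^{y} (1 - \hat{y})^{1-y}\]
Taking the negative log of this likelihood recovers the binary cross entropy; the multiclass loss falls out the same way from a categorical likelihood with a one-hot target.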