Binary cross-entropy loss


The binary cross-entropy loss is given by:

:<math> L = - \frac{1}{m} \sum_{i=1}^{m} \left[ y_i \cdot \log{ (\hat{y}_i) } + (1-y_i) \cdot \log{ (1-\hat{y}_i) } \right] </math>

for <math>m</math> training examples (indexed by <math>i</math>), where <math>y_i</math> is the class label (0 or 1) and <math>\hat{y}_i</math> is the prediction for that example (i.e., the predicted probability that it is a positive example). Thus <math>1-\hat{y}_i</math> is the predicted probability that it is a negative example.
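
As a concrete illustration, the formula above can be computed directly with NumPy. The following is a minimal sketch; the function name, the clipping epsilon (used to avoid taking the log of zero), and the example numbers are illustrative choices, not part of this page.

<syntaxhighlight lang="python">
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Mean binary cross-entropy over m examples.

    y_true: array of class labels (0 or 1).
    y_pred: array of predicted probabilities for the positive class.
    """
    # Clip predictions away from 0 and 1 so that log() stays finite.
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1.0 - y_true) * np.log(1.0 - y_pred))

# Example: two confident correct predictions and one incorrect one.
y_true = np.array([1, 0, 1])
y_pred = np.array([0.9, 0.1, 0.2])
print(binary_cross_entropy(y_true, y_pred))  # ~0.607
</syntaxhighlight>

Note that confident wrong predictions (e.g., predicting 0.2 for a true positive) dominate the average, since the corresponding log term becomes large in magnitude.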