Cross Entropy Loss

The cross-entropy loss, also known as log loss, plays a crucial role in classification tasks, especially in logistic regression and neural networks. It measures the performance of a classification model whose output is a probability value between 0 and 1. The loss increases as the predicted probability diverges from the actual label, which makes it an effective way to quantify how far the predicted probability distribution is from the true distribution of the labels.

The formulae to be used are as follows.
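For binary classification, with true label y in {0, 1} and predicted probability \hat{y}, the per-example loss is

L(y, \hat{y}) = -\left[ y \log(\hat{y}) + (1 - y) \log(1 - \hat{y}) \right]

For multi-class classification over C classes, with one-hot labels y_c and predicted class probabilities \hat{y}_c, it becomes

L(y, \hat{y}) = -\sum_{c=1}^{C} y_c \log(\hat{y}_c)

Over a dataset of N examples, the reported loss is simply the average of the per-example losses.

As a rough illustration, a small NumPy sketch of both forms could look like the following; the function names and the epsilon clipping used to keep log(0) out of the computation are illustrative choices, not part of the original post.

import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Clip predictions away from 0 and 1 so log() stays finite.
    y_pred = np.clip(y_pred, eps, 1 - eps)
    # Average of -[y*log(p) + (1-y)*log(1-p)] over all examples.
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    # y_true is one-hot encoded; each row of y_pred holds per-class probabilities.
    y_pred = np.clip(y_pred, eps, 1.0)
    # Average of -sum_c(y_c * log(p_c)) over all examples.
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

y_true = np.array([1, 0, 1])
# Predictions close to the true labels give a small loss ...
print(binary_cross_entropy(y_true, np.array([0.9, 0.1, 0.8])))   # ~0.14
# ... while confident wrong predictions are penalised heavily.
print(binary_cross_entropy(y_true, np.array([0.1, 0.9, 0.2])))   # ~2.07

The two printed values show the behaviour described above: a prediction near the true label contributes little to the loss, while a confident wrong prediction contributes a great deal.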
