Cross Entropy Loss
The cross-entropy loss, also known as log loss, plays a crucial role in classification tasks, especially in logistic regression and neural networks. It measures the performance of a classification model whose output is a probability value between 0 and 1. Cross-entropy loss increases as the predicted probability diverges from the actual label, making it an effective loss function for assessing the similarity between the predicted probability distribution and the actual distribution.
The formulas for the binary and multi-class cases are as follows.
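For a binary classification problem with true label $y \in \{0, 1\}$ and predicted probability $p$, the loss is usually written as

$$\mathcal{L}(y, p) = -\big[\, y \log(p) + (1 - y)\log(1 - p) \,\big]$$

For a multi-class problem over $C$ classes, with one-hot true labels $y_i$ and predicted class probabilities $p_i$, it generalizes to

$$\mathcal{L}(y, p) = -\sum_{i=1}^{C} y_i \log(p_i)$$

As a rough illustration of how these formulas behave, here is a small NumPy sketch of both forms. The function names and the epsilon clipping value are illustrative choices, not from the original post.

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Binary cross-entropy averaged over samples.

    y_true: array of 0/1 labels, shape (n,)
    y_pred: array of predicted probabilities, shape (n,)
    """
    # Clip predictions away from 0 and 1 to avoid log(0).
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    """Multi-class cross-entropy averaged over samples.

    y_true: one-hot labels, shape (n, C)
    y_pred: predicted class probabilities (rows sum to 1), shape (n, C)
    """
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

# A confident correct prediction gives a small loss,
# while a confident wrong prediction gives a large loss.
print(binary_cross_entropy(np.array([1, 0]), np.array([0.9, 0.1])))  # ~0.105
print(binary_cross_entropy(np.array([1, 0]), np.array([0.1, 0.9])))  # ~2.303
```

The printed values show the property described above: as the predicted probability moves away from the true label, the loss grows sharply because of the logarithm.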