Cross entropy loss

Cross entropy loss is used in models that output probabilities between 0 and 1, usually to express the probability that an instance belongs to a specific class. As the predicted probability diverges from the actual label, the loss increases. For the simple case of a dataset with two classes, with true label y ∈ {0, 1} and predicted probability p, it is calculated as follows:

L = -(y log(p) + (1 - y) log(1 - p))
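A minimal sketch of this computation in Python, assuming the standard binary cross entropy definition averaged over a batch (the function name and the eps clamping are illustrative choices, not from the original text):

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Average binary cross entropy over a batch.

    y_true: iterable of 0/1 labels.
    y_pred: iterable of predicted probabilities in (0, 1).
    eps clamps predictions away from 0 and 1 to avoid log(0).
    """
    total = 0.0
    n = 0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1.0 - eps)  # numerical safety
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
        n += 1
    return total / n

# A confident correct prediction yields low loss;
# a confident wrong prediction yields high loss.
print(binary_cross_entropy([1, 0], [0.9, 0.1]))  # low loss
print(binary_cross_entropy([1, 0], [0.1, 0.9]))  # high loss
```

Note how the loss is asymmetric in confidence: predicting 0.9 for a true label of 1 costs only -log(0.9) ≈ 0.105, while predicting 0.1 costs -log(0.1) ≈ 2.303, so the penalty grows sharply as a confident prediction turns out wrong.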
