
Cross-Entropy Loss

Loss used for single-label classifiers

Cross-entropy loss is a widely used alternative to squared error. It applies when node activations can be interpreted as the probability that each hypothesis is true, i.e., when the output is a probability distribution. It is therefore the standard loss function for neural networks with a softmax activation in the output layer.

Calculation / Interpretation

Cross-entropy measures the distance between the output distribution the model predicts and the true distribution of the data.

$CE = -\sum_{i=1}^{C} t_i \log(p_i)$

where $t_i$ is the true label and $p_i$ is the predicted probability of the $i^{th}$ class, out of $C$ classes. The goal of cross-entropy loss is to measure how well the probability distribution output by softmax matches the one-hot-encoded ground-truth label of the data.
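As a sketch of the formula above, cross-entropy can be computed directly in plain Python (the `cross_entropy` helper and the example values are illustrative, not from the original):

```python
import math

def cross_entropy(t, p):
    """CE = -sum_i t_i * log(p_i) for a one-hot target t and predicted distribution p."""
    return -sum(t_i * math.log(p_i) for t_i, p_i in zip(t, p) if t_i > 0)

# One-hot ground truth: the correct class is class 1 (of 3)
target = [0.0, 1.0, 0.0]
# A softmax output that puts 70% of the mass on the correct class
probs = [0.2, 0.7, 0.1]

print(cross_entropy(target, probs))  # ≈ 0.357, i.e. -log(0.7)
```

Because the target is one-hot, only the probability assigned to the correct class contributes to the loss.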

It uses the logarithm to penalize wrong predictions made with high confidence more strongly.

The cross-entropy loss function comes right after the softmax layer: it takes the softmax output and the true label as its inputs.
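A minimal sketch of that softmax-then-cross-entropy pipeline in plain Python (the helper names and the logit values are made up for illustration):

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy_from_logits(logits, target_index):
    # Softmax turns raw scores into a probability distribution;
    # the loss is the negative log-probability of the correct class
    probs = softmax(logits)
    return -math.log(probs[target_index])

logits = [2.0, 1.0, 0.1]      # raw scores for 3 classes
print(softmax(logits))         # probabilities summing to 1
print(cross_entropy_from_logits(logits, 0))  # ≈ 0.417
```

Note that PyTorch's `nn.CrossEntropyLoss` fuses these two steps: it expects raw logits and applies log-softmax internally.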

Interpretation of Cross-Entropy values:

Cross-Entropy = 0.00: Perfect predictions.

Cross-Entropy < 0.02: Great predictions.

Cross-Entropy < 0.05: On the right track.

Cross-Entropy < 0.20: Fine.

Cross-Entropy > 0.30: Not great.

Cross-Entropy > 1.00: Terrible.

Cross-Entropy > 2.00: Something is seriously broken.
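These bands can be illustrated for a single sample, where the loss is simply the negative log of the probability assigned to the correct class (the example probabilities below are made up to land in each band):

```python
import math

# Probability assigned to the correct class -> resulting cross-entropy
for p in (0.99, 0.96, 0.85, 0.70, 0.35, 0.10):
    ce = -math.log(p)
    print(f"p(correct) = {p:.2f} -> cross-entropy = {ce:.3f}")
```

For example, a model that puts 99% of its probability on the correct class scores about 0.010 (great), while one that puts only 10% there scores about 2.303 (seriously broken).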

Code implementation

PyTorch

```python
# importing the library
import torch
import torch.nn as nn

# Cross-Entropy Loss
# Raw, unnormalized scores (logits) for a batch of 3 samples over 5 classes;
# nn.CrossEntropyLoss applies log-softmax internally, so no softmax is needed here
logits = torch.randn(3, 5, requires_grad=True)
# Ground-truth class indices, one per sample
target = torch.empty(3, dtype=torch.long).random_(5)

cross_entropy_loss = nn.CrossEntropyLoss()
output = cross_entropy_loss(logits, target)
output.backward()

print('input: ', logits)
print('target: ', target)
print('output: ', output)
```

