Conversation


@Popeyef5 Popeyef5 commented Feb 6, 2021

The loss in the MNIST from scratch notebook is off by a factor of 10 because of how the averaging is done. In NLLLoss, the average is taken only over the batch, as described in https://pytorch.org/docs/stable/generated/torch.nn.NLLLoss.html
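
For reference, here is a minimal NumPy sketch of the difference. The array names, shapes, and random data are illustrative, not the notebook's actual code; it just shows how averaging over every element instead of only over the batch shrinks the loss by the number of classes (10 for MNIST):

```python
import numpy as np

# Illustrative shapes: a batch of 128 examples, 10 classes,
# log-probabilities from a log-softmax output.
batch_size, num_classes = 128, 10
rng = np.random.default_rng(0)
logits = rng.normal(size=(batch_size, num_classes))
log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
targets = rng.integers(0, num_classes, size=batch_size)

# One-hot mask selecting the log-probability of the true class.
y = np.zeros((batch_size, num_classes))
y[np.arange(batch_size), targets] = 1.0

# Averaging over every element divides by batch_size * num_classes,
# which makes the loss too small by a factor of num_classes.
loss_wrong = (-log_probs * y).mean()

# NLLLoss averages only over the batch: sum the per-example terms
# (one nonzero entry per row), then divide by batch_size.
loss_right = (-log_probs * y).sum(axis=1).mean()

assert np.isclose(loss_right, loss_wrong * num_classes)
```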

No meaningful change to the results, but nice for completeness.

The `out` variable is not deleted because it is still needed for backprop.

