Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share

classification - Difference between CrossEntropyLoss and NNLLoss with log_softmax in PyTorch?

When building a classifier in PyTorch, I have two options:

  1. Use nn.CrossEntropyLoss without any modification to the model
  2. Use nn.NLLLoss with F.log_softmax added as the last layer of the model

Now, which approach should one use, and why?
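For concreteness, here is a minimal sketch of both options (the toy linear model, shapes, and seed are illustrative assumptions, not from the question):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Option 1: the model emits raw logits; CrossEntropyLoss applies
# log_softmax + NLL internally.
model_a = nn.Linear(10, 3)
criterion_a = nn.CrossEntropyLoss()

# Option 2: log_softmax is applied as the model's last step;
# NLLLoss then consumes log-probabilities directly.
model_b = nn.Linear(10, 3)
model_b.load_state_dict(model_a.state_dict())  # same weights, for comparison
criterion_b = nn.NLLLoss()

x = torch.randn(4, 10)             # hypothetical batch of 4 samples
y = torch.tensor([0, 2, 1, 0])     # integer class labels

loss_a = criterion_a(model_a(x), y)
loss_b = criterion_b(F.log_softmax(model_b(x), dim=1), y)
print(torch.allclose(loss_a, loss_b))  # True: the two losses match
```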


1 Answer


They're the same: nn.CrossEntropyLoss is equivalent to applying F.log_softmax followed by nn.NLLLoss.

If you check the implementation of torch.nn.functional.cross_entropy, you will find that it calls nll_loss after applying log_softmax to the input:

return nll_loss(log_softmax(input, 1), target, weight, None, ignore_index, None, reduction)
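You can verify this numerically with the functional API (the tensors below are made-up examples):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(5, 4)            # batch of 5 samples, 4 classes
target = torch.randint(0, 4, (5,))    # integer class labels

ce = F.cross_entropy(logits, target)
nll = F.nll_loss(F.log_softmax(logits, dim=1), target)

print(torch.allclose(ce, nll))  # True: identical up to floating point
```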

Disclaimer: I am answering the "which approach, and why?" part specifically, without knowledge of your use case.

