pytorch - volatile was removed and now has no effect; use with torch.no_grad(): instead

My torch program stopped at this point. I guess I can no longer use volatile=True.
What is the reason it stops, and how should I change this code?

images = Variable(images.cuda())
targets = [Variable(ann.cuda(), volatile=True) for ann in targets]

train.py:166: UserWarning: volatile was removed and now has no effect. Use with torch.no_grad(): instead.

question from:https://stackoverflow.com/questions/65950353/volatile-was-removed-and-now-has-no-effect-use-with-torch-no-grad-instead


1 Answer

Variable is now a no-op; it has been deprecated since PyTorch 0.4.0, when its functionality was merged into the torch.Tensor class. Back then, the volatile flag was used to disable construction of the computation graph for any operation the volatile Variable was involved in. Newer PyTorch versions instead use with torch.no_grad(): to disable construction of the computation graph for anything in the body of the with statement.
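
For example, here is a minimal sketch of the old style next to the current one (the tensor x is just an illustration, not taken from the question):

import torch

x = torch.randn(3, requires_grad=True)

# Old (pre-0.4) style, now a warning / no-op:
#   y = Variable(x, volatile=True) * 2
# Current equivalent: wrap the computation in torch.no_grad()
with torch.no_grad():
    y = x * 2

print(y.requires_grad)  # False: no graph was built for y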

What you should change depends on why you were using volatile in the first place. Either way, you probably want to use

images = images.cuda()
targets = [ann.cuda() for ann in targets]
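
As a side note (not part of the original answer), the device-agnostic idiom .to(device) does the same thing and also runs on machines without a GPU:

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
images = images.to(device)
targets = [ann.to(device) for ann in targets]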

During training you would use something like the following so that the computation graph is created (assuming standard variable names for model, criterion, and optimizer).

output = model(images)
loss = criterion(output, targets)
optimizer.zero_grad()
loss.backward()
optimizer.step()
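
Put together, a self-contained training step might look like the sketch below. The linear model, cross-entropy loss, and dummy batch are placeholders chosen so the snippet runs on its own; they are not the model from the question.

import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(10, 2).to(device)                       # placeholder model
criterion = nn.CrossEntropyLoss()                         # placeholder loss
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

images = torch.randn(4, 10, device=device)                # dummy batch
targets = torch.randint(0, 2, (4,), device=device)        # dummy labels

output = model(images)             # forward pass builds the computation graph
loss = criterion(output, targets)  # compare predictions against targets
optimizer.zero_grad()              # clear gradients from the previous step
loss.backward()                    # backpropagate through the graph
optimizer.step()                   # update the parameters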

Since you don't need to perform backpropagation during evaluation, you would use with torch.no_grad(): to disable the creation of the computation graph, which reduces the memory footprint and speeds up computation.

with torch.no_grad():
    output = model(images)
    loss = criterion(output, targets)
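
One related point the answer doesn't mention: torch.no_grad() only turns off autograd. If the model contains dropout or batch-norm layers, you also want model.eval() during evaluation (and model.train() afterwards):

model.eval()                       # put dropout/batch-norm layers in eval mode
with torch.no_grad():              # and skip building the autograd graph
    output = model(images)
    loss = criterion(output, targets)
model.train()                      # switch back before resuming training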
