
tensorflow - Holding variables constant during optimization

I have a TensorFlow computational graph for a loss tensor L that depends on 2 tf.Variables, A and B.

I'd like to run gradient ascent on variable A (A += gradient of L w.r.t. A) while holding B fixed, and vice versa: run gradient ascent on B (B += gradient of L w.r.t. B) while holding A fixed. How do I do this?


1 Answer


tf.stop_gradient(tensor) might be what you are looking for. The tensor will be treated as a constant for gradient-computation purposes, so you can create two losses, each with a different part treated as constant.
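For instance, a minimal sketch of that idea in TF1-style code might look like the following. A and B come from the question, but the scalar shapes and the product form of L are placeholders for illustration; note that gradient ascent on L is the same as gradient descent on -L.

import tensorflow as tf

A = tf.Variable(1.0, name="A")
B = tf.Variable(2.0, name="B")

# L with B treated as a constant: gradients flow only into A.
L_fix_B = A * tf.stop_gradient(B)
# L with A treated as a constant: gradients flow only into B.
L_fix_A = tf.stop_gradient(A) * B

# Gradient ascent on L == gradient descent on -L, so minimize the negated objectives.
loss_a = -L_fix_B
loss_b = -L_fix_A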

The other (and often better) option is to create two optimizers but explicitly optimize only a subset of the variables with each, e.g.

train_a = tf.train.GradientDescentOptimizer(0.1).minimize(loss_a, var_list=[A])  # updates A only
train_b = tf.train.GradientDescentOptimizer(0.1).minimize(loss_b, var_list=[B])  # updates B only

and you can alternate between them when running updates, as in the sketch below.
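A minimal alternating-update loop could look like this, assuming the train_a/train_b ops above and a TF1 graph/session; the step count and the 1:1 schedule are arbitrary choices for illustration.

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(100):
        sess.run(train_a)  # updates A only; B stays fixed
        sess.run(train_b)  # updates B only; A stays fixed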

