A very simple example in TensorFlow: minimize (x + 1)^2, where x is a scalar. The code is:
import tensorflow as tf
x = tf.Variable(initial_value=3.0)   # scalar variable to optimize, starting at 3.0
add = tf.add(x, 1)                   # x + 1
y = tf.square(add)                   # (x + 1)^2, the value to minimize
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train = optimizer.minimize(y)        # adds the gradient ops and the update op to the graph
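For reference, the snippet above only builds the graph; it never executes anything. A minimal sketch of running one training step, assuming the standard TF 1.x session API, would be:

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(train)       # one step: x <- x - 0.01 * 2 * (x + 1)
    print(sess.run(x))    # about 2.92, moving toward the minimum at x = -1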
Then write the graph to disk:
graph = tf.get_default_graph()
writer = tf.summary.FileWriter("some/dir/to/write/events")
writer.add_graph(graph=graph)   # serialize the graph definition into the event file
writer.close()                  # make sure the event file is flushed to disk
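To view it, I launch TensorBoard pointed at that directory (tensorboard --logdir some/dir/to/write/events, using the placeholder path from above) and open the Graphs tab.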
Finally, when I visualize it in TensorBoard, it looks like this:
The question is: why is the "Add" node connected to the gradients? Since I am trying to minimize y, I would expect the "Square" node to be connected instead. Is this a bug? Can anyone explain it?
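One way to investigate (just a sketch, assuming the graph built above; the exact op names under the "gradients" scope are whatever TensorFlow generates) is to print which forward tensors the gradient ops take as inputs:

for op in tf.get_default_graph().get_operations():
    if op.name.startswith("gradients"):
        # show each backward op and the tensors it reads
        print(op.name, "<-", [inp.name for inp in op.inputs])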