
neural network - Is the gradient in TensorFlow's graph calculated incorrectly?

A very simple example in TensorFlow: minimize (x + 1)^2, where x is a scalar. The code is:

import tensorflow as tf

x = tf.Variable(initial_value=3.0)
add = tf.add(x, 1)
y = tf.square(add)
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train = optimizer.minimize(y)
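
(A quick sanity check, not part of the original question: assuming TF 1.x and continuing the snippet above, running the train op in a session should drive x from 3.0 toward -1.0, the minimum of (x + 1)^2. The step count below is an arbitrary choice for illustration.)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(500):
        sess.run(train)   # one gradient-descent step per call
    print(sess.run(x))    # approximately -1.0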

Then write the graph to disk:

graph = tf.get_default_graph()
writer = tf.summary.FileWriter("some/dir/to/write/events")
writer.add_graph(graph=graph)

Finally, visualizing it in TensorBoard, the graph looks like this:

[TensorBoard screenshot: the Add node is connected to the gradients subgraph]

The question is: why is the "Add" node connected to the gradients? Since I am trying to minimize y, I would expect the "Square" node to be connected instead. Is this a bug? Can anyone explain it?


1 Answer


There is no bug involved. You just need to understand what a gradient is and how to compute one by hand. For y = (x + 1)^2, the derivative is 2 * (x + 1), which means you do not need the value of (x + 1)^2 itself to compute the gradient; the output of the Add node, x + 1, is enough. If you zoom into the gradients part of the graph, you will see that TensorFlow built the gradient of your Square op and wired in only the parts of the forward graph it actually needs:

[TensorBoard screenshot: zoomed-in gradients subgraph, with the Add output feeding the Square's gradient op]
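
To make that concrete (this snippet is a sketch of mine, not part of the original answer, and assumes TF 1.x): you can ask for the symbolic gradient directly with tf.gradients and check that it evaluates to 2 * (x + 1), i.e. 8.0 at the initial value x = 3.0:

import tensorflow as tf

x = tf.Variable(initial_value=3.0)
add = tf.add(x, 1)     # output of the "Add" node
y = tf.square(add)     # (x + 1)^2

# dy/dx = 2 * (x + 1): the gradient needs the Add output, not the Square output.
grad = tf.gradients(y, x)[0]

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(grad))  # 8.0, i.e. 2 * (3.0 + 1.0)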

Here is a more interesting and more intuitive example:

import tensorflow as tf

x = tf.Variable(initial_value=3.0)
y = tf.cos(x)

train = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(y)

with tf.Session() as sess:
    writer = tf.summary.FileWriter('logs', sess.graph)
    writer.close()

You should know that the derivative of cos(x) is -sin(x), which means that only x is needed to compute the gradient; the value of cos(x) itself is not. And that is exactly what you see in the graph:

[TensorBoard screenshot: gradients subgraph for the cosine example, fed only by x]
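
(Again a sketch of my own rather than part of the original answer, assuming TF 1.x.) The same check for the cosine example confirms that the gradient, -sin(x), is computed from x alone:

import tensorflow as tf

x = tf.Variable(initial_value=3.0)
y = tf.cos(x)

# dy/dx = -sin(x): only x feeds the gradient op; cos(x) itself is not reused.
grad = tf.gradients(y, x)[0]

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(grad))  # roughly -0.141, i.e. -sin(3.0)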


