
math - Does tensorflow use automatic or symbolic gradients?

I haven't been able to find a clear statement of whether TensorFlow uses automatic or symbolic differentiation.

I skimmed the TensorFlow paper and they mention automatic gradients, but it is unclear whether they just mean symbolic gradients, as they also mention that it has that capability.


1 Answer


TF uses automatic differentiation and, more specifically, reverse-mode automatic differentiation.
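For a quick feel of what that means in practice, here is a minimal sketch using `tf.GradientTape` from the TF 2.x eager API (the function being differentiated is just an illustrative example):

```python
import tensorflow as tf

x = tf.Variable(3.0)

with tf.GradientTape() as tape:
    # Forward pass: every op applied to the watched variable is recorded on the tape.
    y = x ** 2 + 2.0 * x

# Reverse-mode pass: walk the recorded ops backwards, applying the chain rule.
dy_dx = tape.gradient(y, x)
print(dy_dx.numpy())  # 2*x + 2 evaluated at x=3 -> 8.0
```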


There are 3 popular methods to calculate the derivative:

  1. Numerical differentiation
  2. Symbolic differentiation
  3. Automatic differentiation

Numerical differentiation relies on the definition of the derivative: f'(x) ≈ (f(x + h) - f(x)) / h, where you choose a very small h and evaluate the function at two nearby points. This is the most basic formula; in practice people use other formulas that give a smaller estimation error. This way of calculating a derivative is suitable mostly when you do not know your function and can only sample it. It also requires a lot of computation for a high-dimensional function.
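As a rough sketch of that idea (using the central-difference variant, which has a smaller error than the one-sided definition; the names here are purely illustrative):

```python
def numerical_derivative(f, x, h=1e-5):
    # Central difference: error is O(h^2) instead of O(h) for the one-sided formula.
    return (f(x + h) - f(x - h)) / (2.0 * h)

f = lambda x: x ** 2 + 2.0 * x
print(numerical_derivative(f, 3.0))  # ~8.0; the exact derivative is 2*x + 2 = 8
```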

Symbolic differentiation manipulates mathematical expressions. If you have ever used MATLAB or Mathematica, you have seen this kind of symbolic output.

For every elementary expression these systems know the derivative, and they use various rules (product rule, chain rule) to calculate the resulting derivative. They then simplify the end result to obtain the final expression.
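In Python, SymPy gives the same kind of symbolic output (shown here only as an illustration; the point applies equally to MATLAB or Mathematica):

```python
import sympy as sp

x = sp.Symbol('x')
expr = x ** 2 + 2 * x

# SymPy applies the sum and power rules, then simplifies the resulting expression.
print(sp.diff(expr, x))  # 2*x + 2
```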

Automatic differentiation manipulates blocks of computer programs. A differentiator has rules for taking the derivative of each element of a program (when you define any op in core TF, you need to register a gradient for that op). It also uses the chain rule to break complex expressions into simpler ones.
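As a hedged sketch of what "registering a gradient" looks like in TF 2.x, `tf.custom_gradient` lets you attach a derivative rule to a block of code (the op below, a numerically stable softplus, is just an example, not something you normally need to define yourself):

```python
import tensorflow as tf

@tf.custom_gradient
def softplus_stable(x):
    # Forward: log(1 + exp(x)) computed in a numerically stable way.
    y = tf.maximum(x, 0.0) + tf.math.log1p(tf.exp(-tf.abs(x)))

    def grad(upstream):
        # Registered derivative rule: d/dx log(1 + exp(x)) = sigmoid(x).
        # The chain rule multiplies it by the gradient flowing in from above.
        return upstream * tf.sigmoid(x)

    return y, grad

x = tf.Variable(2.0)
with tf.GradientTape() as tape:
    y = softplus_stable(x)
print(tape.gradient(y, x).numpy())  # sigmoid(2.0) ≈ 0.88
```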


You might think that automatic differentiation is the same as symbolic differentiation (one operates on math expressions, the other on computer programs), and indeed they are sometimes very similar. But for control-flow statements (`if`, `while`, loops) the results can be very different:

Symbolic differentiation leads to inefficient code (unless carefully done) and faces the difficulty of converting a computer program into a single mathematical expression.
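To see the difference, here is a minimal sketch (assuming TF 2.x eager execution) of automatic differentiation through ordinary Python control flow; only the branch and iterations that actually run are recorded on the tape, so the gradient is exact for the executed path:

```python
import tensorflow as tf

def piecewise(x):
    # Python control flow: only the executed branch/iterations end up on the tape.
    if x > 1.0:
        y = x ** 2
    else:
        y = 3.0 * x
    for _ in range(2):
        y = y * x
    return y

x = tf.Variable(2.0)
with tf.GradientTape() as tape:
    y = piecewise(x)  # runs y = x**2, then two multiplications -> y = x**4

print(tape.gradient(y, x).numpy())  # 4*x**3 at x=2 -> 32.0
```

A symbolic system would instead have to turn the whole program, branches and all, into one closed-form expression before differentiating it.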

