python - PyTorch: Higher-order derivatives of a bivariate function

Suppose I have a bivariate function f(x,y) with domain R^d x R^d, and two sets of inputs X = [x1,x2,...,xm] and Y = [y1,y2,...,yn], where each xi and yj is a d-dimensional vector. I would like to compute the m x n matrix whose (i,j)-th entry is f(xi,yj). I can do this in PyTorch with broadcasting, via something like f(X.unsqueeze(1), Y).
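
For concreteness, here is a minimal sketch of that broadcasting pattern; the squared-distance f is a hypothetical stand-in, not from the question:

import torch

def f(x, y):
    # Hypothetical bivariate function on R^d x R^d: squared
    # Euclidean distance, reduced over the last dimension.
    return (x - y).pow(2).sum(-1)

m, n, d = 4, 6, 3
X = torch.randn(m, d)
Y = torch.randn(n, d)

# X.unsqueeze(1) has shape (m, 1, d); broadcasting against Y's (n, d)
# produces the (m, n) matrix with entries f(xi, yj).
F = f(X.unsqueeze(1), Y)
print(F.shape)  # torch.Size([4, 6])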

What I would really like to compute is the matrix of mixed partial derivatives [d/dx d/dy f(xi,yj)]_ij, where now each xi and yj is scalar-valued. How would I do this in PyTorch?

I am aware that for single-input functions and first derivatives, I can do something like this (using PyTorch's autograd.grad):

from torch.autograd import grad

def derivative(x, f):
    # Reduce to a scalar with .sum(), then differentiate;
    # x must have been created with requires_grad=True.
    return grad(f(x).sum(), x, create_graph=True)[0]

Or:

import torch
from torch.autograd import grad

def derivative(x, f):
    # grad returns a tuple with one gradient per input tensor, hence
    # the [0]; grad_outputs plays the role of the .sum() above.
    return grad(f(x), x, grad_outputs=torch.ones(x.shape), create_graph=True)[0]
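
Both helpers assume x was created with requires_grad=True. A quick usage sketch, with a hypothetical elementwise cube as f:

import torch

x = torch.randn(5, requires_grad=True)
# Elementwise cube; either derivative() variant returns 3 * x**2 here.
print(derivative(x, lambda t: t.pow(3)))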

However, I am unable to generalise this to the bivariate case and to higher-order derivatives. Any help would be appreciated!

question from:https://stackoverflow.com/questions/65861971/pytorch-higher-order-derivates-of-bivariate-function


1 Answer


As shown in PyTorch's autograd docs, you can compute the Hessian (the second-order partial derivatives) of a function with respect to its inputs, similarly to what you did:

import torch

def pow_reducer(x):
    # Scalar-valued function of a tensor input: sum of elementwise cubes.
    return x.pow(3).sum()

inputs = torch.rand(2, 2)
# The result has shape inputs.shape + inputs.shape, here (2, 2, 2, 2).
hessian = torch.autograd.functional.hessian(pow_reducer, inputs)

See the documentation for torch.autograd.functional.hessian specifically.
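
The hessian above covers a single input tensor. For the mixed-partial matrix [d/dx d/dy f(xi,yj)]_ij from the question (scalar xi and yj), one possible sketch is to broadcast both inputs to an explicit (m, n) grid and apply autograd.grad twice; the sin-based f below is a hypothetical example, not from the question:

import torch
from torch.autograd import grad

def f(x, y):
    # Hypothetical elementwise bivariate function.
    return torch.sin(x * y)

m, n = 5, 3
X = torch.randn(m, requires_grad=True)
Y = torch.randn(n, requires_grad=True)

# Broadcast each input to an explicit (m, n) grid so that every
# output entry has its own graph node to differentiate against.
xx = X.unsqueeze(1).expand(m, n)
yy = Y.unsqueeze(0).expand(m, n)
out = f(xx, yy)  # out[i, j] = f(xi, yj)

# out[i, j] depends only on yy[i, j], so one summed backward pass
# recovers the elementwise first derivative df/dy at (xi, yj).
dfdy = grad(out, yy, grad_outputs=torch.ones_like(out), create_graph=True)[0]

# Differentiating again with respect to xx gives the mixed partial.
d2fdxdy = grad(dfdy, xx, grad_outputs=torch.ones_like(dfdy))[0]
print(d2fdxdy.shape)  # torch.Size([5, 3])

Because each grid entry is an independent node, no Python loop over i and j is needed; one grad call per derivative order suffices.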

