Open-source project: martinResearch/MatlabAutoDiff
Repository: https://github.com/martinResearch/MatlabAutoDiff
Language: MATLAB (79.7%)

Goal

This project implements a Matlab/Octave non-intrusive forward automatic differentiation method (Wikipedia definition here), based on operator overloading. It does not provide backward mode or higher-order derivatives. It enables precise and efficient computation of the Jacobian of a function. This contrasts with numerical differentiation (a.k.a. finite differences), which is imprecise due to round-off errors and cannot exploit the sparsity of the derivatives. In contrast with most existing automatic differentiation Matlab toolboxes, the Jacobians are represented as sparse matrices, so the sparsity of the derivatives can be exploited.
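To give a rough idea of how forward differentiation through operator overloading works, here is a minimal, self-contained sketch. It is purely illustrative and is not the class used by this toolbox (which handles matrices, N-D arrays and sparse Jacobians); the class name DualScalar and its methods are made up for this example and would live in their own file DualScalar.m.

classdef DualScalar
    % minimal forward-mode AD: carry a value and its derivative together,
    % and propagate both through each overloaded operator
    properties
        val % function value
        der % derivative with respect to the chosen input variable
    end
    methods
        function obj = DualScalar(val, der)
            obj.val = val;
            obj.der = der;
        end
        function c = plus(a, b)   % sum rule: (a+b)' = a' + b'
            c = DualScalar(a.val + b.val, a.der + b.der);
        end
        function c = mtimes(a, b) % product rule: (a*b)' = a'*b + a*b'
            c = DualScalar(a.val * b.val, a.der * b.val + a.val * b.der);
        end
        function c = sin(a)       % chain rule: (sin a)' = cos(a) * a'
            c = DualScalar(sin(a.val), cos(a.val) * a.der);
        end
    end
end

Seeding the input with derivative one and evaluating the function once then gives the exact derivative: x = DualScalar(2, 1); y = sin(x * x); yields y.der equal to 4*cos(4), without any truncation error.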
Concerning the actual implementation, it is likely that the speed could be improved by representing the Jacobian matrices by their transpose, due to the way Matlab internally represents sparse matrices. The document [1] describes a method similar to the one implemented here and could be a very valuable source to improve the code. The toolbox has been tested on Matlab 2014a and Octave 4.0.0, but the example using the anonymous function @(x)eig(x) does not work on Octave, because Octave does not call the overloaded eig function once it has been anonymized. Note that backward differentiation (a.k.a. gradient back-propagation in deep learning) will be much faster than forward differentiation when the dimension of the output is small in comparison to the dimension of the input. Forward differentiation is of interest when solving a nonlinear least-squares problem, for example through a Levenberg-Marquardt minimization, where we want to compute the full Jacobian matrix of the residuals.

Licence

Free BSD License

Examples

More examples can be found in ./src/AutoDiffExamples.m
>> f=@(x) [sin(x(1)) cos(x(2)) tan(x(1)*x(2))];
>> JAD=full(AutoDiffJacobianAutoDiff(f,[1,1]))
>> JFD=AutoDiffJacobianFiniteDiff(f,[1,1])
>> JAD+i*JFD % comparing visually using complex numbers
>> fprintf('max abs difference = %e\n',full(max(abs(JAD(:)-JFD(:)))));
ans =
0.5403 + 0.5403i 0.0000 + 0.0000i
0.0000 + 0.0000i -0.8415 - 0.8415i
3.4255 + 3.4255i 3.4255 + 3.4255i
max abs difference = 1.047984e-10
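Adding JAD and i*JFD is just a compact way to display the two Jacobians together: in each entry the real part is the automatic-differentiation value and the imaginary part is the finite-difference value, so nearly equal real and imaginary parts throughout indicate that the two computations agree.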
>> f=@(x) (log(x(1:end-1))-tan(x(2:end)))
>> tic; JAD=AutoDiffJacobianAutoDiff(f,0.5*ones(1,5000));timeAD=toc;
>> tic; JFD=sparse(AutoDiffJacobianFiniteDiff(f,0.5*ones(1,5000)));timeFD=toc;
>> fprintf('speedup AD vs FD = %2.1f\n',timeFD/timeAD)
>> fprintf('max abs difference = %e\n',full(max(abs(JAD(:)-JFD(:)))));
speedup AD vs FD = 192.2
max abs difference = 5.351097e-11
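As noted above, forward differentiation pays off when the full Jacobian of a residual vector is needed, for instance inside a Levenberg-Marquardt loop. Below is a minimal sketch of such a loop built on AutoDiffJacobianAutoDiff; the residual function r, the starting point, the damping parameter lambda and the iteration count are invented for the illustration (they are not part of the toolbox), and the Jacobian is assumed to be numel(r(x))-by-numel(x), as in the examples above.

% toy residuals: chain constraints x(i)^2 = x(i+1), plus a weak pull towards 1
r = @(x) [x(1:end-1).^2 - x(2:end), 0.1*(x - 1)];
x = 0.5*ones(1,100);   % starting point, a row vector as in the examples above
lambda = 1e-3;         % damping parameter, kept fixed for simplicity
for iter = 1:20
    J = AutoDiffJacobianAutoDiff(r, x);   % sparse Jacobian of the residuals
    res = r(x)';                          % residuals as a column vector
    % damped normal equations: (J'J + lambda*I) dx = -J'r
    dx = -(J'*J + lambda*speye(size(J,2))) \ (J'*res);
    x = x + dx';
end
fprintf('residual norm after %d iterations = %e\n', iter, norm(r(x)))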
>> f=@(x) sum(x.^2,3);
>> AutoDiffJacobianFiniteDiff(f,ones(2,2,2))
ans =
2.0000 0 0 0 2.0000 0 0 0
0 2.0000 0 0 0 2.0000 0 0
0 0 2.0000 0 0 0 2.0000 0
0 0 0 2.0000 0 0 0 2.0000
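Functions of N-D arrays are differentiated with respect to the flattened (column-major) input, so the Jacobian above is numel(f(x))-by-numel(x), i.e. 4-by-8 for the 2x2 output and the 2x2x2 input; for example, the first output element depends on x(1,1,1) and x(1,1,2), which are linear indices 1 and 5, matching the two 2.0000 entries in the first row. A quick size check (the variable names x0 and J are just for this snippet):

x0 = ones(2,2,2);
J = AutoDiffJacobianFiniteDiff(f,x0);
assert(isequal(size(J),[numel(f(x0)),numel(x0)]))  % 4-by-8 Jacobian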
% image denoising example: build a noisy synthetic image, then minimize a
% least-squares data term plus a smoothed total-variation penalty
Inoisy=ones(100,100)*0.2;           % constant background at 0.2
Inoisy(30:70,30:70)=0.8;            % bright square in the middle
Inoisy=Inoisy+randn(100,100)*0.3;   % add Gaussian noise
imshow(Inoisy)
epsilon=0.0001;                     % smoothing of the total-variation term
% define the objective function using an anonymous function
func=@(I) sum(reshape((Inoisy-I).^2,1,[]))+...
    sum(reshape(sqrt(epsilon+diff(I(:,1:end-1),1,1).^2+...
    diff(I(1:end-1,:),1,2).^2),1,[]));
func(Inoisy)                        % initial value of the objective
speed=zeros(numel(Inoisy),1);   % momentum term of the heavy-ball iterations
% heavy-ball gradient descent, far from the best optimization method but simple
for k=1:500
    [J,f]=AutoDiffJacobian(func,Inoisy);   % J: gradient row vector, f: objective value
    speed=0.95*speed-0.05*J';
    Inoisy(:)=Inoisy(:)+0.05*speed;
    imshow(Inoisy);
end
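Since func is scalar-valued, the Jacobian J returned by AutoDiffJacobian is a 1-by-numel(Inoisy) row vector, i.e. the gradient of the objective, which is why its transpose J' is used directly as the descent direction in the momentum update.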
function exampleSVM()
% fit a small linear SVM with a squared hinge loss on fake data, using the
% automatically computed gradient of the loss
% create some fake data
n=30;
ndim=3;
rng(3);
x = rand(n, ndim);
y = 2 * (rand(n,1) > 0.5) - 1;
l2_regularization = 1e-4;

% define the loss function (nested so that it can access x, y and
% l2_regularization)
    function loss=loss_fn(weights_and_bias)
        weights=weights_and_bias(1:end-1)';
        bias=weights_and_bias(end);
        margin = y .* (x*weights + bias);
        loss = max(0, 1 - margin) .^ 2;     % squared hinge loss per sample
        l2_cost = 0.5 * l2_regularization * (weights'*weights);
        loss = mean(loss) + l2_cost;
    end

w_0=zeros(1,ndim);
b_0 = 0;
weights_and_bias=[w_0,b_0];
fprintf('initial loss %f\n',loss_fn(weights_and_bias))

% heavy-ball gradient descent
speed=zeros(numel(weights_and_bias),1);
tic
nbIter=100;
Err=zeros(1,nbIter);
for k=1:nbIter
    [J,f]=AutoDiffJacobian(@loss_fn,weights_and_bias);
    Err(k)=f;
    speed=0.90*speed-0.1*J';
    weights_and_bias(:)=weights_and_bias(:)+0.5*speed;
end
toc
fprintf('final loss %f\n',loss_fn(weights_and_bias))
figure(1)
plot(Err)
end

Related projects

Example calls from some related Matlab automatic differentiation projects:
f=@(x) sum(x'*x)
[x,dx] = autodiff(rand(5,1),f)
f=@(x)(sum(x'*x))
f(rand(20,1))
f_grad = @(x) adiff(f, x);
f_grad(rand(20,1))
References

[1] Shaun A. Forth. An Efficient Overloaded Implementation of Forward Mode Automatic Differentiation in MATLAB. ACM Transactions on Mathematical Software, 2006.