
How to get a gradient node with mxnet.jl and Julia?

I'm trying to replicate the following example from the mxnet main docs with mxnet.jl in Julia:

A = Variable('A')
B = Variable('B')
C = B * A
D = C + Constant(1)
# get gradient node.
gA, gB = D.grad(wrt=[A, B])
# compiles the gradient function.
f = compile([gA, gB])
grad_a, grad_b = f(A=np.ones(10), B=np.ones(10)*2)

The example shows how to autodiff a symbolic expression and obtain its gradients.
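For context, the gradient nodes in the example come from reverse-mode autodiff: each node records how its output depends on its inputs, and gradients are accumulated by walking the expression graph backwards. This is not MXNet code, just a minimal self-contained Python sketch of the idea (the `Node` class and its methods are illustrative, not part of any MXNet API):

```python
class Node:
    """Minimal reverse-mode autodiff node (illustration only)."""

    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        # Each parent is stored with the local derivative of this
        # node's output with respect to that parent.
        self.parents = parents

    def __mul__(self, other):
        other = other if isinstance(other, Node) else Node(other)
        # d(a*b)/da = b, d(a*b)/db = a
        return Node(self.value * other.value,
                    [(self, other.value), (other, self.value)])

    def __add__(self, other):
        other = other if isinstance(other, Node) else Node(other)
        # d(a+b)/da = d(a+b)/db = 1
        return Node(self.value + other.value,
                    [(self, 1.0), (other, 1.0)])

    def backward(self, seed=1.0):
        # Accumulate the upstream gradient, then push it to parents
        # scaled by each local derivative (chain rule).
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)


# Mirror the docs example with scalars: D = B * A + 1
A = Node(1.0)
B = Node(2.0)
D = B * A + 1
D.backward()
print(A.grad, B.grad)  # dD/dA = B = 2.0, dD/dB = A = 1.0
```

With `A = 1` and `B = 2` this yields `A.grad == 2.0` and `B.grad == 1.0`, matching what the compiled gradient function in the docs example would return elementwise for the `ones` arrays.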

What is the equivalent in mxnet.jl (latest version 2016-03-07)?

Bernhard Kausler asked Nov 08 '22


1 Answer

The code in MXNet.jl/src/symbolic-node.jl may help you find the answer.

I am not familiar with this package, so here is my guess:

A = mx.Variable("A")
B = mx.Variable("B")
C = B .* A
D = C + 1

mx.normalized_gradient may be the solution to the remaining part, if it exists.

Lanting Guo answered Nov 15 '22