I'm trying to replicate the following example from the mxnet main docs with MXNet.jl in Julia:
A = Variable('A')
B = Variable('B')
C = B * A
D = C + Constant(1)
# get gradient node.
gA, gB = D.grad(wrt=[A, B])
# compiles the gradient function.
f = compile([gA, gB])
grad_a, grad_b = f(A=np.ones(10), B=np.ones(10)*2)
The example shows how to autodiff a symbolic expression and obtain its gradients.
What is the equivalent in MXNet.jl (latest version as of 2016-03-07)?
The code in MXNet.jl/src/symbolic-node.jl may be helpful for finding an answer; I am not familiar with this package.
Here is my guess:
A = mx.Variable("A")
B = mx.Variable("B")
C = B .* A
D = C + 1
mx.normalized_gradient may be the solution for the remaining part, if it exists.
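Alternatively, if no direct symbolic grad call is exposed, my understanding is that the executor API could compute the same gradients: bind the symbol with gradient buffers, run a forward pass marked as training, then a backward pass with an all-ones head gradient (since D is not a loss node). Below is a sketch of what I would try, based on the names I see in MXNet.jl's executor code (simple_bind, forward, backward, arg_dict, grad_arrays); the shape keyword arguments and the explicit head gradient are my assumptions, so corrections are welcome:

using MXNet

A = mx.Variable(:A)
B = mx.Variable(:B)
C = B .* A          # element-wise multiply of symbolic nodes
D = C + 1

# Bind to concrete shapes, requesting gradient buffers for all arguments.
exec = mx.simple_bind(D, mx.cpu(), grad_req=mx.GRAD_WRITE, A=(10,), B=(10,))

# Fill the inputs: A = ones, B = 2*ones, as in the Python example.
copy!(exec.arg_dict[:A], ones(Float32, 10))
copy!(exec.arg_dict[:B], 2 * ones(Float32, 10))

# The forward pass must be marked as training for backward to be valid.
mx.forward(exec, is_train=true)

# D is not a loss node, so supply an explicit all-ones head gradient.
mx.backward(exec, mx.ones(size(exec.outputs[1])))

# Gradient buffers come back in argument order; pair them up by name.
grads  = Dict(zip(mx.list_arguments(D), exec.grad_arrays))
grad_a = copy(grads[:A])   # expect dD/dA = B, i.e. all 2s
grad_b = copy(grads[:B])   # expect dD/dB = A, i.e. all 1s

If this is right, grad_a should come back as all 2s and grad_b as all 1s, matching what the Python example's compiled gradient function would return.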