Julia Flux error: SGD optimiser is undefined

Tags:

julia

flux.jl

I want to use the SGD optimiser in Flux, as shown in the Julia Academy tutorial for deep learning with Flux.jl. This is the notebook they provide, in which they use the SGD optimizer as:

opt = SGD(params(model))

However, when I run SGD I get:

ERROR: UndefVarError: SGD not defined

This is my output when I run ?SGD:

search: SGD AMSGrad Signed signed Unsigned unsigned sigmoid issetgid logsigmoid StringIndexError isassigned significand

Couldn't find SGD
Perhaps you meant SGD, Set, Sys, GC, Some, sec, sin, sum, LSTM, csc, esc, isa, ans, abs, cis, cos, eps, ARGS, Pkg, GRU, RNN, cpu, elu, f32, f64, gpu, σ, !, !=, !== or %
  No documentation found.

  Binding SGD does not exist.

As you can see, SGD still appears in the "Perhaps you meant" line.

I do not get an error when I run other optimizers shown in the tutorial, such as ADAM. I am using Flux v0.10.0.

Asked Jan 20 '20 by TheComputerM

1 Answer

The tutorial uses an outdated version of Flux.

In Flux v0.10.0, SGD has been deprecated in favor of Descent, which is a cleaner implementation of the standard gradient descent algorithm.
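For example, the tutorial's SGD call can be replaced like this (the learning rate of 0.1 is illustrative, not taken from the tutorial):

```julia
using Flux

# Old (Flux <= v0.9):  opt = SGD(params(model), 0.1)
# New (Flux v0.10+):   Descent takes only the learning rate
opt = Descent(0.1)
```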

More information on the Descent optimizer can be found in the documentation.

Also, as a side note, you no longer pass params(model) into the optimizer; instead, Flux.train! takes the parameters as a separate argument when training.

# New Way
Flux.train!(loss, params(model), data, optimizer)
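Putting it together, here is a minimal end-to-end sketch of the new calling convention; the tiny model, loss, and random data are placeholders to illustrate the API, not code from the tutorial:

```julia
using Flux

# Hypothetical two-input linear model, just to have parameters to train
model = Dense(2, 1)
loss(x, y) = Flux.mse(model(x), y)

# One batch of random data: 2 features x 10 samples, with 1 target each
data = [(rand(Float32, 2, 10), rand(Float32, 1, 10))]

opt = Descent(0.1)                            # replaces the old SGD(params(model))
Flux.train!(loss, params(model), data, opt)   # params passed to train!, not the optimizer
```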
Answered Sep 24 '22 by TheComputerM