Question about TensorFlow:
I was looking at the video and the model on the site, and they appeared to use only SGD as the learning algorithm. I was wondering whether other algorithms are also included in TensorFlow, such as L-BFGS.
Thank you for your responses.
Stochastic gradient descent (SGD) is a very popular and common optimization algorithm used across machine learning; most importantly, it forms the basis of neural network training.
Why does SGD work? The key idea is that we don't need to look at all the training examples to estimate the direction of steepest descent. By computing the gradient on only one example (or a small batch) at a time and following its slope, we can reach a point that is very close to the actual minimum.
SGD is widely used in deep learning due to its computational efficiency, but a complete understanding of why it performs so well remains an open research question.
Common examples of models whose coefficients can be fitted using gradient descent are linear regression and logistic regression, as in the sketch below.
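To make the one-example-at-a-time idea concrete, here is a minimal sketch of per-example SGD fitting a linear regression in plain NumPy; the data, learning rate, and epoch count are illustrative, not part of any TensorFlow API:

```python
import numpy as np

# Synthetic linear-regression data: y = X @ true_w + noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([3.0, -1.0])
y = X @ true_w + rng.normal(scale=0.1, size=100)

w = np.zeros(2)   # coefficients to learn
lr = 0.05         # learning rate (illustrative)
for epoch in range(20):
    for i in rng.permutation(len(X)):        # visit examples in random order
        grad = 2 * (X[i] @ w - y[i]) * X[i]  # gradient from ONE example
        w -= lr * grad                       # step opposite the slope
print(w)  # should be close to [3., -1.]
```

Each update uses a single example's gradient, which is a noisy but cheap estimate of the full-batch gradient; over many updates the noise averages out and w converges near the true minimizer.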
In TensorFlow's jargon, algorithms such as stochastic gradient descent (SGD) are called optimizers. The optimizers TensorFlow supports include (in tf.keras.optimizers):
- SGD (with optional momentum and Nesterov acceleration)
- Adagrad
- Adadelta
- Adam
- Adamax
- Nadam
- RMSprop
- Ftrl
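A minimal sketch of selecting one of these optimizers in TF 2.x Keras; the model architecture and data here are placeholders, not anything from the question:

```python
import tensorflow as tf

# Declarative route: hand the optimizer to model.compile.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01),
              loss="mse")

# Imperative route: drive any optimizer manually with GradientTape.
opt = tf.keras.optimizers.Adam(learning_rate=1e-3)
x = tf.random.normal((8, 3))
y = tf.random.normal((8, 1))
with tf.GradientTape() as tape:
    loss = tf.reduce_mean((model(x) - y) ** 2)
grads = tape.gradient(loss, model.trainable_variables)
opt.apply_gradients(zip(grads, model.trainable_variables))
```

Swapping SGD for Adam, RMSprop, etc. is a one-line change in either route, since all optimizers share the same interface.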
You can also use the TensorFlow SciPy optimizer interface (tf.contrib.opt.ScipyOptimizerInterface in TF 1.x), which gives you access to SciPy optimizers such as L-BFGS; in TF 2.x, L-BFGS is available through TensorFlow Probability.
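For example, here is a minimal sketch of minimizing a quadratic with L-BFGS via TensorFlow Probability's tfp.optimizer.lbfgs_minimize; it assumes the tensorflow-probability package is installed, and the objective is a toy stand-in for a real loss:

```python
import tensorflow as tf
import tensorflow_probability as tfp

def value_and_gradient(x):
    # Objective f(x) = sum((x - 2)^2); gradient obtained by autodiff.
    return tfp.math.value_and_gradient(
        lambda x: tf.reduce_sum((x - 2.0) ** 2), x)

results = tfp.optimizer.lbfgs_minimize(
    value_and_gradient,
    initial_position=tf.zeros(5),
    tolerance=1e-8)

print(results.converged.numpy())  # True
print(results.position.numpy())   # approximately [2. 2. 2. 2. 2.]
```

Unlike the stochastic optimizers above, L-BFGS is a full-batch quasi-Newton method: it needs the exact loss and gradient at each step, so it suits smaller, deterministic problems rather than minibatch deep-learning training.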
Further, the tf.keras.optimizers API documentation lists all the available TensorFlow optimizers you could use.