I'm guessing that DNN in the sense used in TensorFlow means "deep neural network". But I find this deeply confusing, since the notion of a "deep" neural network seems to be in wide use elsewhere to mean a network with typically several convolutional and/or associated layers (ReLU, pooling, dropout, etc.).

In contrast, in the first place many people will encounter this term (the tf.estimator Quickstart example code) we find:
# Build a 3-layer DNN with 10, 20, 10 units respectively.
classifier = tf.estimator.DNNClassifier(feature_columns=feature_columns,
                                        hidden_units=[10, 20, 10],
                                        n_classes=3,
                                        model_dir="/tmp/iris_model")
This sounds suspiciously shallow, and even more suspiciously like an old-style multilayer perceptron (MLP) network. However, there is no mention of DNN as an alternative term on that close-to-definitive source. So is a DNN in the TensorFlow tf.estimator context actually an MLP? Documentation on the hidden_units parameter (an iterable giving the number of units per hidden layer, with all layers fully connected) suggests this is the case: that has MLP written all over it. Is this understanding correct? Is DNN therefore a misnomer, and if so, should DNNClassifier ideally be deprecated in favour of MLPClassifier? Or does DNN stand for something other than deep neural network?
A deep neural network (DNN) is a specific form of artificial neural network, used within TensorFlow, consisting of a number of layers between an input (features) and an output (target or prediction).

Estimators simplify sharing implementations between model developers. You can develop a great model with high-level, intuitive code; estimators are usually easier to use than the low-level TensorFlow APIs when you need to create models. Estimators are themselves built on tf.keras.

A DNNRegressor is similar, but instead of predicting a category, it predicts a numeric value in a continuous range. If you want an application to predict tomorrow's stock price, you'd create a DNNRegressor. A DNNEstimator can serve as a DNNClassifier or a DNNRegressor, depending on how you configure it.
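The classifier/regressor split is essentially just the output head sitting on top of the same stack of hidden layers. Here is a minimal sketch in plain Python (the helper names are my own, not part of the tf.estimator API): a classifier head turns logits into class probabilities, while a regressor head returns the raw value.

```python
import math

def softmax(logits):
    # Numerically stable softmax: shift by the max before exponentiating.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classifier_head(logits):
    # DNNClassifier-style head: class probabilities plus an argmax label.
    probs = softmax(logits)
    return probs, max(range(len(probs)), key=probs.__getitem__)

def regressor_head(logits):
    # DNNRegressor-style head: the raw linear output, no squashing.
    return logits[0]

probs, label = classifier_head([2.0, 1.0, 0.1])  # three logits, as in n_classes=3
value = regressor_head([3.5])
```

Everything below the head (the hidden_units stack) is identical in both cases, which is why a single DNNEstimator can be configured to serve as either.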
Give me your definition of "deep" neural network and you get your answer.

But yes, it is simply an MLP, and a more accurate name would indeed be MLPClassifier. It just does not sound as cool as the current name.
First of all, your definition of DNN is a bit misleading.

There are several architectures of deep neural networks. In particular, a deep feedforward network is nothing more than a multilayered MLP, plus some techniques that make it attractive. Some works have used "DNN" to span all deep learning architectures; however, by convention, "DNN" refers to architectures that use deep forward propagation, also called deep feedforward networks.

The most important example of a deep learning model is the deep feedforward network, or multilayer perceptron (MLP). An MLP is just a mathematical function that maps some set of input values to output values. The function is formed by the composition of many simpler functions, and each application of a different function can be seen as providing a new representation of the input.
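That composition of functions can be sketched directly. Below is a tiny pure-Python forward pass (the weights are random and the helper names are purely illustrative) for the hidden_units=[10, 20, 10] configuration from the question, with 4 input features and 3 output classes as in the Iris example:

```python
import random

random.seed(0)

def relu(v):
    # Elementwise ReLU: the pointwise nonlinearity between layers.
    return [max(0.0, x) for x in v]

def dense(v, weights, biases):
    # One fully connected layer: out[j] = sum_i v[i] * weights[j][i] + biases[j].
    return [sum(vi * wi for vi, wi in zip(v, row)) + b
            for row, b in zip(weights, biases)]

def init_layer(n_in, n_out):
    # Toy initializer: small random weights, zero biases.
    return ([[random.uniform(-0.5, 0.5) for _ in range(n_in)]
             for _ in range(n_out)],
            [0.0] * n_out)

def mlp_forward(x, sizes):
    # Compose dense + ReLU layers; the final layer stays linear (logits).
    v = x
    for i, (n_in, n_out) in enumerate(zip(sizes, sizes[1:])):
        w, b = init_layer(n_in, n_out)
        v = dense(v, w, b)
        if i < len(sizes) - 2:
            v = relu(v)
    return v

# 4 input features, hidden_units=[10, 20, 10], 3 output classes.
logits = mlp_forward([5.1, 3.5, 1.4, 0.2], [4, 10, 20, 10, 3])
```

Nothing here is deeper than the old MLP: it is the same composition of affine maps and pointwise nonlinearities, which is exactly the point the question is raising.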
Therefore, it makes sense that this estimator is called DNNClassifier.

My advice is to read this book here.