Let's say I have a simple neural network with an input layer and a single convolution layer, programmed in TensorFlow:
# Input Layer
input_layer = tf.reshape(features["x"], [-1, 28, 28, 1])
# Convolutional Layer #1
conv1 = tf.layers.conv2d(
    inputs=input_layer,
    filters=32,
    kernel_size=[5, 5],
    padding="same",
    activation=tf.nn.relu)
I leave out the rest of the network definition here.
If I wanted to add an LSTM layer after this convolution layer, I would have to make the convolution layer TimeDistributed (in the language of Keras) and then feed the output of the TimeDistributed layer into the LSTM.
TensorFlow offers access to the Keras layers in tf.keras.layers. Can I use these Keras layers directly in TensorFlow code? If so, how? Could I also use tf.keras.layers.LSTM to implement the LSTM layer?
So, in general: is a mixture of pure TensorFlow code and Keras code possible, and can I use tf.keras.layers?
Yes, this is possible.
Import both TensorFlow and Keras and register your TF session as the Keras backend session:
import tensorflow as tf
import keras
from keras import backend as K
tf_sess = tf.Session()
K.set_session(tf_sess)
Now, in your model definition, you can mix TF and Keras layers like so:
# Input Layer
input_layer = tf.reshape(features["x"], [-1, 28, 28, 1])
# Convolutional Layer #1
conv1 = tf.layers.conv2d(
    inputs=input_layer,
    filters=32,
    kernel_size=[5, 5],
    padding="same",
    activation=tf.nn.relu)
# Flatten conv output
flat = tf.contrib.layers.flatten(conv1)
# Fully-connected Keras layer
layer2_dense = keras.layers.Dense(128, activation='relu')(flat)
# Fully-connected TF layer (output)
output_preds = tf.layers.dense(layer2_dense, units=10)
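As for the TimeDistributed/LSTM part of your question: yes, tf.keras.layers works for that too. Below is a minimal sketch using only tf.keras layers, assuming sequences of 10 frames of 28x28 grayscale images and an LSTM with 64 units (both values are arbitrary placeholders, not something from your setup):

import tensorflow as tf
from tensorflow.keras import layers

# Input: batches of sequences of 10 frames, each 28x28x1
inputs = tf.keras.Input(shape=(10, 28, 28, 1))
# Apply the same convolution to every time step
x = layers.TimeDistributed(
    layers.Conv2D(32, (5, 5), padding="same", activation="relu"))(inputs)
# Flatten each time step's feature map into a vector
x = layers.TimeDistributed(layers.Flatten())(x)
# LSTM consumes the sequence of flattened conv features
outputs = layers.LSTM(64)(x)
model = tf.keras.Model(inputs, outputs)

The key point is that TimeDistributed wraps the Conv2D so it is applied independently to each of the 10 time steps, producing a sequence of feature vectors that the LSTM can consume.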
This answer is adapted from a Keras blog post by François Chollet.