I have been working through the TensorFlow tutorials on their website. In the Text Classification with an RNN exercise, I encounter this error. I have attempted a few changes, and have also copy-and-pasted the tutorial code, only to receive the same error. Any suggestions? Thank you.
I have tried shuffling the dataset AFTER calling padded_batch. From the documentation, I can see that the shuffled dataset has no output_shapes attribute, and I cannot figure out an alternative approach.
BUFFER_SIZE = 10000
BATCH_SIZE = 64
train_dataset = train_dataset.shuffle(BUFFER_SIZE)
train_dataset = train_dataset.padded_batch(BATCH_SIZE, train_dataset.output_shapes)
test_dataset = test_dataset.padded_batch(BATCH_SIZE, test_dataset.output_shapes)
to receive this error:
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-7-8a10fa01af19> in <module>()
3
4 train_dataset = train_dataset.shuffle(BUFFER_SIZE)
----> 5 train_dataset = train_dataset.padded_batch(BATCH_SIZE, train_dataset.output_shapes)
6
7 test_dataset = test_dataset.padded_batch(BATCH_SIZE, test_dataset.output_shapes)
AttributeError: 'ShuffleDataset' object has no attribute 'output_shapes'
Try replacing
train_dataset = train_dataset.padded_batch(BATCH_SIZE, train_dataset.output_shapes)
with
train_dataset = train_dataset.padded_batch(BATCH_SIZE, tf.compat.v1.data.get_output_shapes(train_dataset))
It is not part of the question, but you can also build your train_dataset all in one go, for example:
train_dataset = (
train_dataset
.shuffle(BUFFER_SIZE)
.padded_batch(BATCH_SIZE, tf.compat.v1.data.get_output_shapes(train_dataset)))
Figured I would throw that in there just to give another way of writing it ;)
Also, as far as train_dataset.output_shapes goes, it has been deprecated in recent versions of TensorFlow (https://www.tensorflow.org/api_docs/python/tf/data/Dataset#element_spec), so if you are on TF2 you can either go through compat.v1 as above or use ds.element_spec.
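In fact, in TF 2.x you usually don't need the shapes argument at all: padded_batch accepts an explicit padded_shapes (and, since TF 2.2, can infer it from the dataset automatically). Here is a minimal sketch, assuming TF 2.x, using a toy dataset of variable-length integer sequences standing in for the tutorial's encoded text:

```python
import tensorflow as tf

BUFFER_SIZE = 10000
BATCH_SIZE = 4

# Toy variable-length sequences standing in for encoded text reviews.
sequences = [[1, 2, 3], [4, 5], [6], [7, 8, 9, 10]]
dataset = tf.data.Dataset.from_generator(
    lambda: iter(sequences),
    output_signature=tf.TensorSpec(shape=(None,), dtype=tf.int32))

dataset = dataset.shuffle(BUFFER_SIZE)
# padded_shapes=[None] pads each batch to its longest sequence.
# In TF >= 2.2 you could simply call dataset.padded_batch(BATCH_SIZE)
# and let it infer the shapes, avoiding output_shapes entirely.
dataset = dataset.padded_batch(BATCH_SIZE, padded_shapes=[None])

for batch in dataset:
    print(batch.shape)  # (batch_size, longest_sequence_in_batch)
```

With all four toy sequences fitting in one batch, the batch comes out padded to the longest sequence, so no attribute of the shuffled dataset is ever needed.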