I tried to use the BigQueryReader from TensorFlow, but I can't manage to actually read any data. This is my code:
```python
import tensorflow as tf
from tensorflow.contrib.cloud.python.ops.bigquery_reader_ops import BigQueryReader
import time

features = dict(
    weight_pounds=tf.FixedLenFeature([1], tf.float32),
    mother_age=tf.FixedLenFeature([1], tf.float32),
    father_age=tf.FixedLenFeature([1], tf.float32),
    gestation_weeks=tf.FixedLenFeature([1], tf.float32))

millis = int(round(time.time() * 1000))

reader = BigQueryReader(project_id="bigquery-public-data",
                        dataset_id="samples",
                        table_id="natality",
                        timestamp_millis=millis,
                        num_partitions=10,
                        features=features)

queue = tf.train.string_input_producer(reader.partitions())
row_id, examples_serialized = reader.read(queue)
examples = tf.parse_example(examples_serialized, features=features)
```
When executing this code sample I get:
```
File "/home/juta/.local/lib/python2.7/site-packages/tensorflow/python/framework/common_shapes.py", line 659, in _call_cpp_shape_fn_impl
    raise ValueError(err.message)
ValueError: Shape must be rank 1 but is rank 0 for 'ParseExample_3/ParseExample' (op: 'ParseExample') with input shapes: [], [0], [], [], [], [], [0], [0], [0], [0].
```
The parsing is probably failing because `reader.read(queue)` seems to return empty objects:

```
ReaderRead(key=<tf.Tensor 'ReaderRead:0' shape=() dtype=string>, value=<tf.Tensor 'ReaderRead:1' shape=() dtype=string>)
```
Why is the reader not returning any data?
The reader is not returning empty objects: it is returning scalars, i.e. tensors with rank 0, whose shape `()` is "empty". See the TensorFlow programmer's guide on tensor shapes for more details.
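To make the distinction concrete, here is a minimal sketch (written against the `tf.compat.v1` API so it runs on current TensorFlow builds) showing that a rank-0 string tensor has shape `()` yet still carries data:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# A rank-0 (scalar) string tensor: its shape is (), but it is not empty.
t = tf.constant("some serialized record")
print(t.shape)  # ()

with tf.Session() as sess:
    print(sess.run(t))  # b'some serialized record'
```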
The shape error "Shape must be rank 1 but is rank 0" indicates that the `tf.parse_example()` op expects a vector (rank-1 tensor) of serialized examples as input, rather than a scalar. There are at least two possible solutions:
- Use the `tf.parse_single_example()` op, which expects a scalar input, instead.
- Convert the result of `reader.read()` into a vector, for example using `tf.expand_dims(examples_serialized, 0)`.
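Both fixes can be sketched with a hand-built serialized `Example` standing in for the scalar string that `reader.read()` returns. This is written against the `tf.compat.v1` API so it runs on current TensorFlow; the feature name `mother_age` and its value are illustrative stand-ins:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# A hypothetical serialized tf.train.Example playing the role of the
# rank-0 string tensor that reader.read() returns.
record = tf.train.Example(features=tf.train.Features(feature={
    "mother_age": tf.train.Feature(
        float_list=tf.train.FloatList(value=[28.0])),
}))
serialized = tf.constant(record.SerializeToString())  # scalar, shape ()

features = {"mother_age": tf.FixedLenFeature([1], tf.float32)}

# Option 1: parse the scalar record directly.
single = tf.parse_single_example(serialized, features=features)

# Option 2: expand the scalar into a length-1 vector for tf.parse_example().
batch = tf.parse_example(tf.expand_dims(serialized, 0), features=features)

with tf.Session() as sess:
    print(sess.run(single["mother_age"]))  # shape [1]
    print(sess.run(batch["mother_age"]))   # shape [1, 1]
```

Note that option 1 yields per-feature tensors for one record (shape `[1]` here), while option 2 yields batched tensors with a leading batch dimension (shape `[1, 1]`), so downstream code must match whichever layout you choose.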