I've got a bunch of images in a format similar to CIFAR-10: a binary file with 96*96*3 bytes per image, one image after another (the STL-10 dataset). The file I'm opening is 138MB.
I tried to read and check the contents of the Tensors containing the images to be sure that the reading is done right, but I have two questions:

1. Does FixedLengthRecordReader load the whole file, but just provide inputs one at a time? Reading the first record_bytes should be relatively fast, yet the code takes about two minutes to run.
2. How do I get the actual image contents? I tried sess.run(uint8image), but the result is empty.

The code is below:
import tensorflow as tf

def read_stl10(filename_queue):
    class STL10Record(object):
        pass

    result = STL10Record()
    result.height = 96
    result.width = 96
    result.depth = 3

    image_bytes = result.height * result.width * result.depth
    record_bytes = image_bytes

    reader = tf.FixedLengthRecordReader(record_bytes=record_bytes)
    result.key, value = reader.read(filename_queue)
    print value

    record_bytes = tf.decode_raw(value, tf.uint8)

    # The data is stored depth-major, so reshape to (depth, height, width)
    # and transpose to the usual (height, width, depth) layout.
    depth_major = tf.reshape(tf.slice(record_bytes, [0], [image_bytes]),
                             [result.depth, result.height, result.width])
    result.uint8image = tf.transpose(depth_major, [1, 2, 0])

    return result

# probably a hack since I should've provided a string tensor
filename_queue = tf.train.string_input_producer(['./data/train_X'])

image = read_stl10(filename_queue)
print image.uint8image

with tf.Session() as sess:
    result = sess.run(image.uint8image)
    print result, type(result)
Output:
Tensor("ReaderRead:1", shape=TensorShape([]), dtype=string)
Tensor("transpose:0", shape=TensorShape([Dimension(96), Dimension(96), Dimension(3)]), dtype=uint8)
I tensorflow/core/common_runtime/local_device.cc:25] Local device intra op parallelism threads: 4
I tensorflow/core/common_runtime/local_session.cc:45] Local session inter op parallelism threads: 4
[empty line for last print]
Process finished with exit code 137
I'm running this on my CPU, if that adds anything.
EDIT: I found the pure TensorFlow solution thanks to Rosa. Apparently, when using string_input_producer, you need to start the queue runners in order to see the results. The only thing that needs to be added to the code above is the second line below:
...
with tf.Session() as sess:
    tf.train.start_queue_runners(sess=sess)
    ...
Afterwards, the image in result can be displayed with matplotlib.pyplot.imshow(result). I hope this helps someone. If you have any further questions, feel free to ask me or check the link in Rosa's answer.
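If you want to double-check the depth-major byte layout without running the TensorFlow graph, the same reshape/transpose can be reproduced in NumPy on a synthetic record (just a sketch for verification, not part of the pipeline above):

```python
import numpy as np

height, width, depth = 96, 96, 3

# Synthetic stand-in for the raw bytes of one record read from the file.
raw = np.random.randint(0, 256, height * width * depth).astype(np.uint8)

# Mirror the graph ops: reshape depth-major, then transpose to (H, W, C).
depth_major = raw.reshape(depth, height, width)
image = depth_major.transpose(1, 2, 0)

print(image.shape)  # (96, 96, 3)
# For every pixel: image[i, j, c] == depth_major[c, i, j]
```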
Just to give a complete answer:
from PIL import Image
import numpy as np
import tensorflow as tf

filename_queue = tf.train.string_input_producer(['/Users/HANEL/Desktop/tf.png'])  # list of files to read

reader = tf.WholeFileReader()
key, value = reader.read(filename_queue)

my_img = tf.image.decode_png(value)  # use png or jpg decoder based on your files

init_op = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init_op)

    # Start populating the filename queue.
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(coord=coord)

    for i in range(1):  # length of your filename list
        image = my_img.eval()  # here is your image Tensor :)

    print(image.shape)
    Image.fromarray(np.asarray(image)).show()

    coord.request_stop()
    coord.join(threads)
Or, if you have a directory of images, you can add them all via this Github source file.
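For example, you could build the filename list with glob from the standard library (the directory here is just a stand-in created for the demo; in practice, point it at your actual image folder):

```python
import glob
import os
import tempfile

# Stand-in directory with a few empty .png files, for demonstration only.
image_dir = tempfile.mkdtemp()
for name in ('a.png', 'b.png', 'c.png'):
    open(os.path.join(image_dir, name), 'wb').close()

filenames = sorted(glob.glob(os.path.join(image_dir, '*.png')))
print(len(filenames))  # 3

# This list can then be passed to tf.train.string_input_producer(filenames).
```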
@mttk and @salvador-dali: I hope it is what you need
According to the documentation, you can decode JPEG/PNG images.
It should be something like this:
import tensorflow as tf

filenames = ['/image_dir/img.jpg']
filename_queue = tf.train.string_input_producer(filenames)

reader = tf.WholeFileReader()
key, value = reader.read(filename_queue)

images = tf.image.decode_jpeg(value, channels=3)
You can find a bit more info here.