 

Tensorflow: Extracting image and label from TFRecords file


I have a TFRecords file which contains images together with their labels, names, sizes, etc. My goal is to extract the label and the image as a numpy array.

I do the following to load the file:

import tensorflow as tf

def extract_fn(data_record):
    features = {
        # Extract features using the keys set during creation
        "image/class/label":    tf.FixedLenFeature([], tf.int64),
        "image/encoded":        tf.VarLenFeature(tf.string),
    }
    sample = tf.parse_single_example(data_record, features)
    #sample = tf.cast(sample["image/encoded"], tf.float32)
    return sample

filename = "path\train-00-of-10"
dataset = tf.data.TFRecordDataset(filename)
dataset = dataset.map(extract_fn)
iterator = dataset.make_one_shot_iterator()
next_element = iterator.get_next()

with tf.Session() as sess:
    while True:
        data_record = sess.run(next_element)
        print(data_record)

The image is saved as a string. How can I convert it to float32? I tried sample = tf.cast(sample["image/encoded"], tf.float32), which does not work. I want data_record to be a list containing the image as a numpy array and the label as an np.int32 number. How can I do that?

Right now data_record looks like this:

{'image/encoded': SparseTensorValue(indices=array([[0]]), values=array([b'\xff\xd8\ ... 8G\xff\xd9'], dtype=object), dense_shape=array([1])), 'image/class/label': 394}

I have no idea how to work with that. I would appreciate any help.

EDIT

If I print sample and sample['image/encoded'] in extract_fn(), I get the following:

print(sample) = {'image/encoded': <tensorflow.python.framework.sparse_tensor.SparseTensor object at 0x7fe41ec15978>, 'image/class/label': <tf.Tensor 'ParseSingleExample/ParseSingleExample:3' shape=() dtype=int64>}

print(sample['image/encoded']) = SparseTensor(indices=Tensor("ParseSingleExample/ParseSingleExample:0", shape=(?, 1), dtype=int64), values=Tensor("ParseSingleExample/ParseSingleExample:1", shape=(?,), dtype=string), dense_shape=Tensor("ParseSingleExample/ParseSingleExample:2", shape=(1,), dtype=int64))

It seems that the image is a sparse tensor and tf.image.decode_image throws an error. What is the right way to extract the image as a tf.float32 tensor?

asked Feb 16 '19 by Gilfoyle


1 Answer

I believe you store the images encoded as JPEG, PNG, or some other format, so when reading them back you have to decode them:

def extract_fn(data_record):
    features = {
        # Extract features using the keys set during creation
        "image/class/label":    tf.FixedLenFeature([], tf.int64),
        "image/encoded":        tf.VarLenFeature(tf.string),
    }
    sample = tf.parse_single_example(data_record, features)
    image = tf.image.decode_image(sample['image/encoded'], dtype=tf.float32) 
    label = sample['image/class/label']
    return image, label

...

with tf.Session() as sess:
    while True:
        image, label = sess.run(next_element)
        image = image.reshape(IMAGE_SHAPE)
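
As a side note, the while True read loop will eventually raise tf.errors.OutOfRangeError once the one-shot iterator is exhausted; a small sketch of a loop that stops cleanly (same TF 1.x API as above) could look like:

with tf.Session() as sess:
    while True:
        try:
            image, label = sess.run(next_element)
            # image comes back as a numpy array, label as a numpy int64 scalar
            print(image.shape, label)
        except tf.errors.OutOfRangeError:
            # dataset exhausted
            break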

Update: It seems you got your data as a single-cell value in a sparse tensor. Try converting it back to dense and inspecting it before and after decoding:

def extract_fn(data_record):
    features = {
        # Extract features using the keys set during creation
        "image/class/label":    tf.FixedLenFeature([], tf.int64),
        "image/encoded":        tf.VarLenFeature(tf.string),
    }
    sample = tf.parse_single_example(data_record, features)
    label = sample['image/class/label']
    # default_value must be a string to match the tensor's string dtype
    dense = tf.sparse_tensor_to_dense(sample['image/encoded'], default_value='')

    # Comment this out if you get an error and inspect just dense.
    # decode_image expects a scalar string, so take element 0:
    image = tf.image.decode_image(dense[0], dtype=tf.float32)

    return dense, image, label
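
Putting the pieces together, a minimal end-to-end sketch (TF 1.x API as above, assuming JPEG- or PNG-encoded images; the filename is a placeholder for the question's path) could look like this:

import tensorflow as tf

def extract_fn(data_record):
    features = {
        "image/class/label": tf.FixedLenFeature([], tf.int64),
        "image/encoded":     tf.VarLenFeature(tf.string),
    }
    sample = tf.parse_single_example(data_record, features)
    # Densify the single-entry sparse string tensor and take the scalar element
    encoded = tf.sparse_tensor_to_dense(sample["image/encoded"], default_value="")[0]
    image = tf.image.decode_image(encoded, dtype=tf.float32)   # values scaled to [0, 1]
    label = tf.cast(sample["image/class/label"], tf.int32)
    return image, label

dataset = tf.data.TFRecordDataset("train-00-of-10")  # placeholder path
dataset = dataset.map(extract_fn)
next_element = dataset.make_one_shot_iterator().get_next()

with tf.Session() as sess:
    image, label = sess.run(next_element)
    # image is a float32 numpy array, label a numpy int32 scalar
    print(image.dtype, image.shape, label)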
answered Sep 17 '22 by Dmytro Prylipko