
TypeError: Input 'filename' of 'ReadFile' Op has type float32 that does not match expected type of string

Tags: python, keras

I am running the code from this tutorial: https://keras.io/examples/vision/image_classification_from_scratch/

with a custom dataset that, as in the tutorial, is split into two subsets. However, I get this error:

TypeError: Input 'filename' of 'ReadFile' Op has type float32 that does not match expected type of string.

I tried adding this cast:

is_jfif = str(tf.compat.as_bytes("JFIF")) in fobj.peek(10)

but the error did not change. I have been trying all day to figure out how to solve it, without any success. Can someone help me? Thank you.
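(Editorial note: the filter step in that tutorial compares raw header bytes, so the check needs to compare bytes to bytes; wrapping the marker in str() produces the literal text "b'JFIF'" rather than the marker bytes. A stdlib-only sketch of such a filter, with a hypothetical folder path:)

```python
import os

def filter_non_jfif(folder):
    """Delete images whose header lacks the JFIF marker (likely corrupted)."""
    num_skipped = 0
    for fname in os.listdir(folder):
        fpath = os.path.join(folder, fname)
        with open(fpath, "rb") as fobj:
            # Compare bytes to bytes: peek(10) returns raw header bytes,
            # so the needle must be a bytes literal, not a str.
            is_jfif = b"JFIF" in fobj.peek(10)
        if not is_jfif:
            num_skipped += 1
            os.remove(fpath)
    print(f"Deleted {num_skipped} images")

# filter_non_jfif("patterns/patterndir")  # hypothetical path
```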

asked Jun 14 '20 by just_learning

3 Answers

The simplest fix I found is to create a subfolder and copy the files into it. Say your files are 0.jpg, 1.jpg, 2.jpg, ..., 2000.jpg and sit directly in a directory named "patterns".

The Keras API seems not to accept this layout: because the files are named with bare numbers, somewhere along the way they end up interpreted as float32 rather than strings.

To overcome this, either rename the files as another answer suggests, or simply create a subfolder under "patterns" (e.g. "patterndir"), so your image files end up under ...\patterns\patterndir.

Keras is possibly using the subdirectory name internally, perhaps attaching it in front of the image file name and thus making it a string (something like patterndir_01.jpg, patterndir_02.jpg). [Note: this is my interpretation, it may not be accurate.]

When you run it this time, you will see that it works, and you will get console output like:

Found 2001 files belonging to 1 classes.
Using 1601 files for training.
Found 2001 files belonging to 1 classes.
Using 400 files for validation.

My code looks like this:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

#Generate a dataset

image_size = (28, 28)
batch_size = 32

train_ds = tf.keras.preprocessing.image_dataset_from_directory(
    "patterns",
    validation_split=0.2,
    subset="training",
    seed=1337,
    image_size=image_size,
    batch_size=batch_size,
)
val_ds = tf.keras.preprocessing.image_dataset_from_directory(
    "patterns",
    validation_split=0.2,
    subset="validation",
    seed=1337,
    image_size=image_size,
    batch_size=batch_size,
)
answered Oct 11 '22 by Matt Allen

In my case, I simply did not have enough samples in the training directories: there was only one per category, and I got the same error.
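A quick way to catch this before loading is to count the files in each class subdirectory; a stdlib-only sketch (the "patterns" root is hypothetical):

```python
import os

def count_samples(data_root):
    """Return {class_name: number_of_files} for each subdirectory."""
    counts = {}
    for class_name in sorted(os.listdir(data_root)):
        class_dir = os.path.join(data_root, class_name)
        if os.path.isdir(class_dir):
            counts[class_name] = len(os.listdir(class_dir))
    return counts

# Warn when a class is too small to survive a train/validation split.
if os.path.isdir("patterns"):  # hypothetical dataset root
    for name, n in count_samples("patterns").items():
        if n < 2:
            print(f"{name}: only {n} sample(s) -- too few for a split")
```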

answered Oct 11 '22 by Shtefan


Just make a subdirectory and move your files there.

So if the files are here:

'/home/dataset_28/'

Put them here:

'/home/dataset_28/files/'

And then do this:

from tensorflow.keras.preprocessing import image_dataset_from_directory

dataset = image_dataset_from_directory('/home/dataset_28/', batch_size=1, image_size=(28, 28))
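The move itself can be scripted with the standard library; a minimal sketch (the '/home/dataset_28' path is the hypothetical one from this answer):

```python
import os
import shutil

def move_into_subdir(src, subdir="files"):
    """Move every file in src into src/subdir so Keras sees one class folder."""
    dst = os.path.join(src, subdir)
    os.makedirs(dst, exist_ok=True)
    for name in os.listdir(src):
        path = os.path.join(src, name)
        if os.path.isfile(path):  # skip the new subfolder itself
            shutil.move(path, os.path.join(dst, name))
    return dst

# move_into_subdir('/home/dataset_28')  # hypothetical path
```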
answered Oct 11 '22 by Gabriel