I want to load data from a directory containing around 5000 images (type 'png'), but it returns an error saying there are no images when there obviously are. This code:
width = int(wb - wa)
height = int(hb - ha)
directory = '/content/drive/My Drive/Colab Notebooks/Hair/Images'
train_ds = tf.keras.preprocessing.image_dataset_from_directory(
    directory, labels=densitat, label_mode='int',
    color_mode='rgb', batch_size=32, image_size=(width, height),
    shuffle=True, seed=1,
    validation_split=0.2, subset='training', follow_links=False)
Returns:
ValueError: Expected the lengths of `labels` to match the number of files in the target directory. len(labels) is 5588 while we found 0 files in /content/drive/My Drive/Colab Notebooks/Hair/Images.
I can see the images: [screenshot: Colab view of the folder structure with the images]
Where is the problem? I need to use this function to load the data in batches, as I have a large dataset.
I have found the answer, so I am posting it in case it helps someone else.

The problem is the path: I was passing the folder that directly contains the images, whereas the function expects the parent directory (one level above), since it scans subdirectories for image files.

directory = '/content/drive/My Drive/Colab Notebooks/Hair'

Note that the images themselves sit in the 'Images' subfolder of '/Hair'.
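The relationship between the two paths can be sketched with plain `os.path` (no TensorFlow needed): the directory to pass is simply the parent of the folder that holds the PNGs.

```python
import os

# The path I originally passed: the folder that directly contains the PNGs.
images_dir = '/content/drive/My Drive/Colab Notebooks/Hair/Images'

# image_dataset_from_directory wants the PARENT of that folder,
# because it looks inside subdirectories for image files.
directory = os.path.dirname(images_dir)
print(directory)  # → /content/drive/My Drive/Colab Notebooks/Hair
```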
If the accepted solution above doesn't solve your problem, it could be because you are trying to load TIFF images with a .tif extension. It turns out the only formats allowed by image_dataset_from_directory are ('.bmp', '.gif', '.jpeg', '.jpg', '.png').
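A quick way to check whether this is biting you is to count how many files in the directory actually carry one of the allowed extensions. This is a minimal sketch with stdlib `pathlib` only; `count_loadable_images` is a hypothetical helper, not part of Keras, and it approximates (rather than reproduces) the loader's internal file scan.

```python
from pathlib import Path

# Extensions accepted by image_dataset_from_directory,
# per the answer above.
ALLOWED = ('.bmp', '.gif', '.jpeg', '.jpg', '.png')

def count_loadable_images(directory):
    """Count files under `directory` (recursively) whose
    extension the Keras loader would accept."""
    return sum(
        1
        for p in Path(directory).rglob('*')
        if p.suffix.lower() in ALLOWED
    )
```

If this returns 0 while your folder is full of .tif files, converting them (e.g. with Pillow) or re-exporting as PNG is the way out.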