
tf.data.Dataset.from_tensor_slices, tensors and eager mode

Using Iris dataset example:

train_ds_url = "http://download.tensorflow.org/data/iris_training.csv"

Imports used:

import tensorflow as tf
import pandas as pd
import numpy as np
tf.enable_eager_execution()

I downloaded the dataset and then used pd.read_csv to build the train_plantfeatures and train_categories arrays.

categories='Plants'

train_path = tf.keras.utils.get_file(train_ds_url.split('/')[-1], train_ds_url)

train = pd.read_csv(train_path, names=ds_columns, header=0)
train_plantfeatures, train_categories = train, train.pop(categories)
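The pop call removes the label column from the DataFrame and returns it, so features and labels end up in separate objects. A minimal sketch with a toy frame (the column names here are illustrative, not the full Iris schema):

```python
import pandas as pd

# Toy stand-in for the Iris training frame; column names are illustrative
train = pd.DataFrame({
    "SepalLength": [5.1, 4.9, 6.3],
    "Plants": [0, 0, 2],
})

# pop() drops the label column from the frame in place and returns it
features, labels = train, train.pop("Plants")

print(list(features.columns))  # label column is gone from the features
print(labels.tolist())         # labels now live in a separate Series
```

Note that features is still the same DataFrame object, just without the "Plants" column.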

After that I used tf.contrib.keras.utils.to_categorical to create the categorical representation.

y_categorical = tf.contrib.keras.utils.to_categorical(train_categories, num_classes=3)
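to_categorical turns the integer class ids (0, 1, 2 for the three Iris species) into one-hot rows. Since the tf.contrib namespace no longer exists in TF 2.x, here is a NumPy-only sketch of the same transformation (the label values are made up for illustration):

```python
import numpy as np

labels = np.array([0, 2, 1, 0])  # illustrative integer class ids
num_classes = 3

# One-hot encode: row i of the result is the identity-matrix row
# selected by labels[i]
y_categorical = np.eye(num_classes)[labels]

print(y_categorical.shape)  # one row per label, one column per class
```

In TF 2.x the equivalent built-in is tf.keras.utils.to_categorical.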

When I tried to use tf.data.Dataset with from_tensor_slices:

dataset = tf.data.Dataset.from_tensor_slices((train_plantfeatures, y_categorical))

I received:

ValueError: Can't convert non-rectangular Python sequence to Tensor.

The same implementation without eager mode works perfectly. Here is the Colab example.

Nicolas Bortolotti asked Oct 22 '25 16:10

1 Answer

The from_tensor_slices() method expects NumPy arrays (or tensors) as input, but in this case the variable train_plantfeatures is a pandas DataFrame.

type(train_plantfeatures)
Out: pandas.core.frame.DataFrame

To make this work, add .values to convert from Pandas to Numpy:

dataset = tf.data.Dataset.from_tensor_slices((train_plantfeatures.values,
                                              y_categorical))

Do the same for the test_plantfeatures variable:

dataset_test = tf.data.Dataset.from_tensor_slices((test_plantfeatures.values, 
                                                   y_categorical_test))
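The .values accessor returns the DataFrame's underlying data as a single rectangular NumPy array, which from_tensor_slices can convert to a tensor without tripping over the DataFrame's column structure. A quick pandas-only check with a toy frame:

```python
import numpy as np
import pandas as pd

# Toy frame standing in for train_plantfeatures
df = pd.DataFrame({"a": [1.0, 2.0], "b": [3.0, 4.0]})

arr = df.values  # 2-D ndarray: one row per sample, one column per feature

print(type(arr))
print(arr.shape)
```

In newer pandas versions, df.to_numpy() is the recommended spelling of the same conversion.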
Ekaba Bisong answered Oct 25 '25 04:10


