
Tensorflow error in Colab - ValueError: Shapes (None, 1) and (None, 10) are incompatible

I'm trying to run a small neural-network example using the MNIST dataset for digit recognition. When execution reaches the fit line I get ValueError: Shapes (None, 1) and (None, 10) are incompatible

import numpy as np

# Install TensorFlow
try:
  # The %tensorflow_version magic only exists in Colab
  %tensorflow_version 2.x

except Exception:
  pass

import tensorflow as tf

tf.__version__

mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()

print(x_train.shape)
print(x_test.shape)
print(y_train.shape)
print(y_test.shape)
print(np.unique(y_train))
print(np.unique(y_test))

import matplotlib.pyplot as plt
plt.imshow(x_train[0], cmap='Greys');

y_train[0]

x_train, x_test = x_train / 255.0, x_test / 255.0
x_train.shape

model = tf.keras.Sequential([
                           tf.keras.layers.Flatten(input_shape=(28, 28)),
                           tf.keras.layers.Dense(units=512, activation='relu'),
                           tf.keras.layers.Dense(units=10, activation='softmax')
])
model.summary()
model.compile(optimizer='rmsprop', loss='categorical_crossentropy', metrics=['accuracy'])
h = model.fit(x_train, y_train, epochs=10, batch_size=256)

I get an error on the last line, as if x_train and y_train were of different sizes. But x_train is 60000x28x28 and y_train is 60000x1.


Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten (Flatten)            (None, 784)               0         
_________________________________________________________________
dense (Dense)                (None, 512)               401920    
_________________________________________________________________
dense_1 (Dense)              (None, 10)                5130      
=================================================================
Total params: 407,050
Trainable params: 407,050
Non-trainable params: 0
_________________________________________________________________
Epoch 1/10
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-10-50705bca2031> in <module>()
      6 model.summary()
      7 model.compile(optimizer='rmsprop', loss='categorical_crossentropy', metrics=['accuracy'])
----> 8 h = model.fit(x_train, y_train, epochs=10, batch_size=256)

10 frames
/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/func_graph.py in wrapper(*args, **kwargs)
    966           except Exception as e:  # pylint:disable=broad-except
    967             if hasattr(e, "ag_error_metadata"):
--> 968               raise e.ag_error_metadata.to_exception(e)
    969             else:
    970               raise

ValueError: in user code:

    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/training.py:571 train_function  *
        outputs = self.distribute_strategy.run(
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/distribute/distribute_lib.py:951 run  **
        return self._extended.call_for_each_replica(fn, args=args, kwargs=kwargs)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/distribute/distribute_lib.py:2290 call_for_each_replica
        return self._call_for_each_replica(fn, args, kwargs)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/distribute/distribute_lib.py:2649 _call_for_each_replica
        return fn(*args, **kwargs)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/training.py:533 train_step  **
        y, y_pred, sample_weight, regularization_losses=self.losses)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/compile_utils.py:205 __call__
        loss_value = loss_obj(y_t, y_p, sample_weight=sw)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/losses.py:143 __call__
        losses = self.call(y_true, y_pred)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/losses.py:246 call
        return self.fn(y_true, y_pred, **self._fn_kwargs)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/losses.py:1527 categorical_crossentropy
        return K.categorical_crossentropy(y_true, y_pred, from_logits=from_logits)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/backend.py:4561 categorical_crossentropy
        target.shape.assert_is_compatible_with(output.shape)
    /usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/tensor_shape.py:1117 assert_is_compatible_with
        raise ValueError("Shapes %s and %s are incompatible" % (self, other))

    ValueError: Shapes (None, 1) and (None, 10) are incompatible
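The mismatch the traceback points at can be reproduced in isolation (a minimal sketch with hypothetical values, not the asker's data): categorical_crossentropy compares the shape of the label tensor against the model's (batch, 10) softmax output, and integer labels fail that check.

```python
import tensorflow as tf

# Hypothetical stand-ins: a batch of 2 integer labels vs. a (2, 10) softmax output.
y_true = tf.constant([[3], [1]])       # integer labels, shape (2, 1)
y_pred = tf.random.uniform((2, 10))    # fake softmax output, shape (2, 10)

try:
    tf.keras.losses.categorical_crossentropy(y_true, y_pred)
except ValueError as e:
    print(e)  # the same shape-incompatibility error as in the traceback
```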

asked Jun 18 '20 by maestro73

2 Answers

You need to one-hot encode your y_train vector before passing it to the fit method. You can do that using the following code:

from tensorflow.keras.utils import to_categorical

# build the model and load the training dataset as before

y_train = to_categorical(y_train)

# then call the fit method as before
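As a quick check (with a few hypothetical labels), to_categorical turns an integer label vector into rows of length num_classes, which is the (None, 10) shape the loss expects:

```python
import numpy as np
from tensorflow.keras.utils import to_categorical

labels = np.array([0, 2, 9])                       # three example digit labels
one_hot = to_categorical(labels, num_classes=10)   # shape (3, 10)
print(one_hot.shape)   # (3, 10)
print(one_hot[1])      # 1.0 at index 2, zeros elsewhere
```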
answered Nov 15 '22 by Tirth Patel


The issue is here:

model.compile(optimizer='rmsprop', loss='categorical_crossentropy', metrics=['accuracy'])

The loss categorical_crossentropy expects one-hot encoded class vectors, as described in the Keras losses documentation. However, your labels are integers, not one-hot vectors. In this case the simplest solution is to use loss='sparse_categorical_crossentropy', since your labels are sparse (integer-encoded).
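A minimal sketch of that fix (random arrays standing in for MNIST), where the integer labels are passed straight to fit without any one-hot encoding:

```python
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(512, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])
# sparse_categorical_crossentropy accepts integer labels directly
model.compile(optimizer='rmsprop',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Random stand-in data: labels stay as a plain integer vector of shape (32,)
x = np.random.rand(32, 28, 28).astype('float32')
y = np.random.randint(0, 10, size=(32,))
h = model.fit(x, y, epochs=1, verbose=0)
print(h.history['loss'])
```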

answered Nov 15 '22 by strider0160