I have been working on a notebook in Google Colab, and all of a sudden I get the following error.
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-1-bd6ec74ccf2e> in <module>()
----> 1 from keras.utils import to_categorical
ImportError: cannot import name 'to_categorical' from 'keras.utils' (/usr/local/lib/python3.7/dist-packages/keras/utils/__init__.py)
---------------------------------------------------------------------------
NOTE: If your import is failing due to a missing package, you can
manually install dependencies using either !pip or !apt.
To view examples of installing some common dependencies, click the
"Open Examples" button below.
This is very strange, since it was working just fine before I restarted my session. I also tried with another Google account (in case something was wrong with my account's setup), but I still got the same error.
This is what i use to import the function.
from keras.utils import to_categorical
I'm wondering if anything changed, and whether anyone else is experiencing the same issue. Thanks.
The to_categorical function converts a class vector (integers) to a binary class matrix, e.g. for use with categorical_crossentropy. Its main argument is y: an array-like of class values to be converted into a matrix (integers from 0 to num_classes - 1).
In older Keras versions it was exposed as np_utils.to_categorical, which converts an array of labeled data (from 0 to nb_classes - 1) to one-hot vectors, and was imported as
from keras.utils import np_utils
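To make clear what the function actually produces, here is a minimal plain-NumPy sketch of one-hot encoding (an illustration only, not the Keras source; the name to_categorical_sketch is mine):

```python
import numpy as np

def to_categorical_sketch(y, num_classes=None):
    """One-hot encode integer class labels, mirroring the shape
    and dtype of keras.utils.to_categorical's output."""
    y = np.asarray(y, dtype=int)
    if num_classes is None:
        # infer the number of classes from the largest label
        num_classes = int(y.max()) + 1
    # row i of the identity matrix is the one-hot vector for class i
    return np.eye(num_classes, dtype="float32")[y]

print(to_categorical_sketch([0, 1, 2, 3]))
```

This only illustrates the encoding; in practice you should use the Keras function itself, which also handles multi-dimensional label arrays.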
Since TF 2.0, it has been moved into tensorflow. Please import it this way:
from tensorflow.keras.utils import to_categorical
to_categorical([0, 1, 2, 3], num_classes=4)
The result will be:
array([[1., 0., 0., 0.],
[0., 1., 0., 0.],
[0., 0., 1., 0.],
[0., 0., 0., 1.]], dtype=float32)
Alternatively, you can call it through the tf namespace:
import tensorflow as tf
y_train_one_hot = tf.keras.utils.to_categorical(y_train)
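A common follow-up is converting the one-hot matrix back to integer labels, which argmax along the last axis does. A small sketch using NumPy in place of the TensorFlow call (the y_train values here are hypothetical stand-ins):

```python
import numpy as np

# hypothetical integer labels standing in for a real y_train
y_train = np.array([2, 0, 1, 2])

# one-hot encode (same result as tf.keras.utils.to_categorical(y_train))
one_hot = np.eye(3, dtype="float32")[y_train]

# argmax along the last axis inverts the encoding
recovered = one_hot.argmax(axis=-1)
print(recovered)  # [2 0 1 2]
```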