I'm using Python Keras to build a CNN model.
I followed the CNN MNIST example and modified it for my code. This is the example I found:
# Read MNIST data
(X_Train, y_Train), (X_Test, y_Test) = mnist.load_data()
# Translation of data
X_Train40 = X_Train.reshape(X_Train.shape[0], 28, 28, 1).astype('float32')
X_Test40 = X_Test.reshape(X_Test.shape[0], 28, 28, 1).astype('float32')
My CSV data has 30222 rows and 6 columns.
That is 10074 samples, where each sample is a 3 × 6 block of information.
For example, rows 1–3 of the matrix form one block.
So I changed the reshape to match my data's format:
X_Train40 = X_Train.reshape(10074, 3, 6, 1)
X_Test40 = X_Test.reshape(4319, 3, 6, 1)
Then this error occurs:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-133-4f23172d450a> in <module>()
----> 1 X_Train40 = X_Train.reshape(10074, 3, 6, 1)
2 X_Test40 = X_Test.reshape(4319, 3, 6, 1)
~\Anaconda3\lib\site-packages\numpy\matrixlib\defmatrix.py in __array_finalize__(self, obj)
269 return
270 elif (ndim > 2):
--> 271 raise ValueError("shape too large to be a matrix.")
272 else:
273 newshape = self.shape
ValueError: shape too large to be a matrix.
Just guessing, but since the data comes from a CSV file, it was probably converted to an np.matrix, which is restricted to being 2-dimensional.
Internally, NumPy tries to preserve the matrix type (and thus its 2-D shape), so to reshape to a higher number of dimensions you first need to convert it to an ndarray, like this:
X_Train = np.array(X_Train)
X_Test = np.array(X_Test)
X_Train40 = X_Train.reshape(10074, 3, 6, 1)
X_Test40 = X_Test.reshape(4319, 3, 6, 1)
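A minimal sketch of the round trip, using synthetic stand-in data of the same shape as your CSV (30222 rows × 6 columns), shows both the failure on np.matrix and the fix after converting to ndarray:

```python
import numpy as np

# Synthetic stand-in for CSV data that ended up as np.matrix
# (e.g. via pandas' deprecated .as_matrix() or an explicit np.matrix call).
X = np.matrix(np.zeros((30222, 6), dtype='float32'))

# np.matrix is hard-limited to 2 dimensions, so a 4-D reshape raises.
try:
    X.reshape(10074, 3, 6, 1)
except ValueError as e:
    print(e)  # shape too large to be a matrix.

# Converting to a plain ndarray lifts the restriction:
# 30222 rows / 3 rows per block = 10074 blocks of shape (3, 6, 1).
X4 = np.array(X).reshape(10074, 3, 6, 1)
print(X4.shape)  # (10074, 3, 6, 1)
```

Note that np.matrix is deprecated in modern NumPy; loading the CSV directly into an ndarray (e.g. with np.genfromtxt or pandas' DataFrame.to_numpy()) avoids the problem entirely.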