I have converted voice to a spectrogram using librosa. The shape of the spectrogram is (257, 356), which I have reshaped to (257, 356, 1).
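For reference, a minimal sketch of that preprocessing step (assuming the spectrogram comes from a magnitude STFT with n_fft=512, which yields 257 frequency bins; the file path is a placeholder):

import librosa
import numpy as np

y, sr = librosa.load('voice.wav', sr=None)   # placeholder path
S = np.abs(librosa.stft(y, n_fft=512))       # magnitude spectrogram, shape (257, n_frames)
A = S[..., np.newaxis]                       # add a channel axis -> (257, n_frames, 1)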
I have created a model:
from keras.models import Sequential
from keras.layers import Dense, Conv2D, Flatten
model = Sequential()
model.add(Conv2D(64, kernel_size=3, activation='relu', input_shape=A.shape))
model.add(Flatten())
model.add(Dense(1, activation='softmax'))
While fitting the model, the following error is produced:
model.fit(A,validation_data=(A2), epochs=3)
where A2 is another spectrogram:
ValueError: Error when checking input: expected conv2d_3_input to have 4 dimensions, but got array with shape (257, 356, 1)
Model Summary
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d_24 (Conv2D)           (None, 255, 354, 64)      640
_________________________________________________________________
conv2d_25 (Conv2D)           (None, 253, 352, 32)      18464
_________________________________________________________________
flatten_11 (Flatten)         (None, 2849792)           0
_________________________________________________________________
dense_11 (Dense)             (None, 10)                28497930
=================================================================
Total params: 28,517,034
Trainable params: 28,517,034
Non-trainable params: 0
And the shape of A[0] is
A[0].shape = (356, 1)
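The error itself points at the fix: Conv2D expects a 4-D batch of shape (samples, height, width, channels), while A is a single, unbatched 3-D array. A minimal sketch of adding the missing batch axis (assuming A already has shape (257, 356, 1)):

import numpy as np

A = np.expand_dims(A, axis=0)   # (257, 356, 1) -> (1, 257, 356, 1)

With the batch axis added, input_shape must stay the per-sample shape (257, 356, 1), which is what A.shape[1:] gives in the working code below.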
EDIT: Here's my working code:
from keras.models import Sequential
from keras.layers import Dense, Conv2D, Flatten
import numpy as np
A = np.zeros((1,257,356,1)) # Only for illustration
A2 = np.zeros((1,1)) # Only for illustration
model = Sequential()
model.add(Conv2D(64, kernel_size=(3,3), activation='relu', input_shape=A.shape[1:])) # input_shape ==> (257,356,1)
model.add(Flatten())
model.add(Dense(1, activation='softmax'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(A, A2, validation_data = (A, A2), epochs=3)
And here's the output for 3 epochs:
Train on 1 samples, validate on 1 samples
Epoch 1/3
1/1 [==============================] - 0s 250ms/step - loss: 0.0000e+00 - accuracy: 1.0000 - val_loss: 0.0000e+00 - val_accuracy: 1.0000
Epoch 2/3
1/1 [==============================] - 0s 141ms/step - loss: 0.0000e+00 - accuracy: 1.0000 - val_loss: 0.0000e+00 - val_accuracy: 1.0000
Epoch 3/3
1/1 [==============================] - 0s 156ms/step - loss: 0.0000e+00 - accuracy: 1.0000 - val_loss: 0.0000e+00 - val_accuracy: 1.0000
<keras.callbacks.callbacks.History at 0x1d508dbb708>
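One caveat about the model above: a softmax over a single unit always outputs 1.0 regardless of its input, so the network cannot actually learn anything from these labels. For a single binary label, a sigmoid output with binary_crossentropy is the usual choice; a minimal variant under the same illustrative setup:

from keras.models import Sequential
from keras.layers import Dense, Conv2D, Flatten

model = Sequential()
model.add(Conv2D(64, kernel_size=(3, 3), activation='relu', input_shape=(257, 356, 1)))
model.add(Flatten())
model.add(Dense(1, activation='sigmoid'))   # sigmoid, not softmax, for a single output unit
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])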