I am trying to use a Keras neural network to recognize digits drawn on an HTML canvas and output the predicted digit. I have saved the trained network and use Django to run the web interface. But whenever I submit a drawing, I get an internal server error from the server-side code. The error says Exception: Error when checking : expected dense_input_1 to have shape (None, 784) but got array with shape (784, 1). My only view is:
from django.shortcuts import render
from django.http import HttpResponse
import StringIO
from PIL import Image
import numpy as np
import re
from keras.models import model_from_json

def home(request):
    if request.method == "POST":
        vari = request.POST.get("imgBase64", "")
        imgstr = re.search(r'base64,(.*)', vari).group(1)
        tempimg = StringIO.StringIO(imgstr.decode('base64'))
        im = Image.open(tempimg).convert("L")
        im.thumbnail((28, 28), Image.ANTIALIAS)
        img_np = np.asarray(im)
        img_np = img_np.flatten()
        img_np = img_np.astype("float32")  # astype returns a copy, so the result must be assigned
        img_np = img_np / 255
        json_file = open('model.json', 'r')
        loaded_model_json = json_file.read()
        json_file.close()
        loaded_model = model_from_json(loaded_model_json)
        # load weights into new model
        loaded_model.load_weights("model.h5")
        # note: loss/optimizer here do not affect predict(); training used categorical_crossentropy
        loaded_model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
        output = loaded_model.predict(img_np)
        score = output.tolist()
        return HttpResponse(score)
    else:
        return render(request, "digit/index.html")
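For reference, the preprocessing above produces a 1-D vector. A stand-alone sketch of the same pipeline (digit.png is a hypothetical sample file, not from the original post) makes the shape that ends up being passed to predict visible:

import numpy as np
from PIL import Image

# same steps as the view: grayscale, 28x28 thumbnail, flatten, scale to [0, 1]
im = Image.open('digit.png').convert('L')
im.thumbnail((28, 28), Image.ANTIALIAS)
img_np = np.asarray(im).flatten().astype('float32') / 255

print(img_np.shape)  # (784,) -- a flat vector with no batch axis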
I have already checked the related questions I could find, without success.
Edit: Following Rohan's suggestion, here is my stack trace:
Internal Server Error: /home/
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/base.py", line 149, in get_response
    response = self.process_exception_by_middleware(e, request)
  File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/base.py", line 147, in get_response
    response = wrapped_callback(request, *callback_args, **callback_kwargs)
  File "/home/vivek/keras/neural/digit/views.py", line 27, in home
    output=loaded_model.predict(img_np)
  File "/usr/local/lib/python2.7/dist-packages/keras/models.py", line 671, in predict
    return self.model.predict(x, batch_size=batch_size, verbose=verbose)
  File "/usr/local/lib/python2.7/dist-packages/keras/engine/training.py", line 1161, in predict
    check_batch_dim=False)
  File "/usr/local/lib/python2.7/dist-packages/keras/engine/training.py", line 108, in standardize_input_data
    str(array.shape))
Exception: Error when checking : expected dense_input_1 to have shape (None, 784) but got array with shape (784, 1)
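The traceback shows predict received an array of shape (784, 1): this Keras version expands a 1-D input of shape (784,) along axis 1 before the shape check, so it sees 784 samples of 1 feature each. The mismatch is easy to reproduce outside Django with a minimal stand-in model (a hypothetical sketch, not the saved network from this question):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# stand-in for the saved network: a model expecting 784 inputs per sample
model = Sequential()
model.add(Dense(10, input_dim=784, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')

flat = np.zeros(784, dtype='float32')  # shape (784,), like img_np after flatten()
batch = flat.reshape(1, 784)           # shape (1, 784): one sample, 784 features

print(model.predict(batch).shape)  # works: (1, 10)
model.predict(flat)                # raises the same shape-check exception as above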
Also, here is the script I used to train and save the network initially:
import numpy
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import Dropout
from keras.utils import np_utils

# fix random seed for reproducibility
seed = 7
numpy.random.seed(seed)
(X_train, y_train), (X_test, y_test) = mnist.load_data()
for item in y_train.shape:
    print item
num_pixels = X_train.shape[1] * X_train.shape[2]
X_train = X_train.reshape(X_train.shape[0], num_pixels).astype('float32')
X_test = X_test.reshape(X_test.shape[0], num_pixels).astype('float32')
# normalize inputs from 0-255 to 0-1
X_train = X_train / 255
X_test = X_test / 255
print X_train.shape
# one hot encode outputs
y_train = np_utils.to_categorical(y_train)
y_test = np_utils.to_categorical(y_test)
num_classes = y_test.shape[1]

# define baseline model
def baseline_model():
    # create model
    model = Sequential()
    model.add(Dense(num_pixels, input_dim=num_pixels, init='normal', activation='relu'))
    model.add(Dense(num_classes, init='normal', activation='softmax'))
    # Compile model
    model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
    return model

# build the model
model = baseline_model()
# Fit the model
model.fit(X_train, y_train, validation_data=(X_test, y_test), nb_epoch=20, batch_size=200, verbose=1)
# Final evaluation of the model
scores = model.evaluate(X_test, y_test, verbose=0)
print("Baseline Error: %.2f%%" % (100 - scores[1] * 100))
# serialize model to JSON
model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)
# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")
Edit: I tried reshaping img_np to (1, 784) and it also failed, giving the same error as the title of this question.
Thanks for the help, and please leave comments on anything I should add to the question.
You're asking the neural network to evaluate 784 cases with one input each, instead of a single case with 784 inputs. I had the same problem, and I solved it by passing an array with a single element, which is itself the array of inputs. See the example below: the first call works, whereas the second gives the same error you're experiencing.
model.predict(np.array([[0.5, 0.0, 0.1, 0.0, 0.0, 0.4, 0.0, 0.0, 0.1, 0.0, 0.0]]))
model.predict(np.array([0.5, 0.0, 0.1, 0.0, 0.0, 0.4, 0.0, 0.0, 0.1, 0.0, 0.0]))
hope this solves it for you as well :)
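Applied to the view in the question, that suggestion amounts to giving img_np a leading batch axis before calling predict. Sketched as replacement lines inside home() (using reshape; np.expand_dims(img_np, axis=0) does the same thing):

img_np = img_np.reshape(1, 784)        # one sample of 784 features -> shape (1, 784)
output = loaded_model.predict(img_np)  # now matches the expected (None, 784)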