I'm using the code from the MNIST tutorial:
feature_columns = [tf.contrib.layers.real_valued_column("", dimension=4)]
classifier = tf.contrib.learn.DNNClassifier(feature_columns=feature_columns,
                                            hidden_units=[10, 20, 10],
                                            n_classes=2,
                                            model_dir="/tmp/iris_model")
classifier.fit(x=np.array(train, dtype='float32'),
               y=np.array(y_tr, dtype='int64'),
               steps=2000)
accuracy_score = classifier.evaluate(x=np.array(test, dtype='float32'),
                                     y=y_test)["auc"]
print('AUC: {0:f}'.format(accuracy_score))

from tensorflow.contrib.learn import SKCompat
ds_test_ar = np.array(ds_test, dtype='float32')

ds_predict_tf = classifier.predict(input_fn=_my_predict_data)
print('Predictions: {}'.format(str(ds_predict_tf)))
but at the end I got the following result instead of the predictions:
Predictions: <generator object DNNClassifier.predict.<locals>.<genexpr> at 0x000002CE41101CA8>
What did I do wrong?
What you received and saved to ds_predict_tf is a generator (produced by a generator expression inside predict), not the prediction values themselves.
To print its contents you can iterate over it:
for i in ds_predict_tf:
    print(i)
or
print(list(ds_predict_tf))
You can read more about generator expressions in the Python documentation.
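For illustration, here is a minimal, standalone Python sketch (no TensorFlow involved) showing why printing a generator shows a <genexpr> object until you actually iterate over it:

# A generator expression is evaluated lazily: nothing is computed until you iterate.
squares = (n * n for n in range(5))

print(squares)        # <generator object <genexpr> at 0x...>, same shape as the output in the question
print(list(squares))  # [0, 1, 4, 9, 16] -- iterating materializes the values
print(list(squares))  # []               -- a generator can only be consumed once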
The DNNClassifier predict method defaults to as_iterable=True, so it returns a generator. To get the prediction values directly instead of a generator, pass as_iterable=False to classifier.predict.
For example:
classifier.predict(input_fn=_my_predict_data, as_iterable=False)
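Putting the two options together, here is a short sketch rather than a full script: it assumes the classifier and the _my_predict_data input function defined in the question are already in scope.

# Option 1: keep the default as_iterable=True and materialize the generator yourself.
ds_predict_tf = classifier.predict(input_fn=_my_predict_data)
predictions = list(ds_predict_tf)   # consumes the generator; do this only once
print('Predictions: {}'.format(predictions))

# Option 2: ask predict() for concrete values up front with as_iterable=False.
predictions = classifier.predict(input_fn=_my_predict_data, as_iterable=False)
print('Predictions: {}'.format(predictions))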
For more detail about the classifier's methods and arguments (including the as_iterable flag), see the predict section of the DNNClassifier documentation.
Solution:
pred = classifier.fit(x=training_set.data, y=training_set.target, steps=2000).predict(test_set.data)
print("Predictions:")
print(list(pred))
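This works because fit in tf.contrib.learn returns the classifier itself, which is why the calls can be chained; pred is still a generator at that point, and list(pred) is what finally evaluates it into concrete predictions.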
That's it...