I am attempting to write a simple deep learning model with TensorFlow. I'm using a toy dataset I made up in Excel, just to get the model working and accepting data. My code is as follows:
import pandas as pd
import numpy as np
import tensorflow as tf
raw_data = np.genfromtxt('ai/mock-data.csv', delimiter=',', dtype=str)
my_data = np.delete(raw_data, (0), axis=0) #deletes the first row, axis=0 indicates row, axis=1 indicates column
my_data = np.delete(my_data, (0), axis=1) #deletes the first column
policy_state = tf.feature_column.categorical_column_with_vocabulary_list('policy_state', [
    'AL', 'CA', 'MI'
])
modern_classic_ind = tf.feature_column.categorical_column_with_vocabulary_list('modern_classic_ind', [
    '0', '1'
])
h_plus_ind = tf.feature_column.categorical_column_with_vocabulary_list('h_plus_ind', [
    '0', '1'
])
retention_ind = tf.feature_column.categorical_column_with_vocabulary_list('retention_ind', [
    '0', '1'
])
feature_columns = [
    tf.feature_column.indicator_column(policy_state),
    tf.feature_column.indicator_column(modern_classic_ind),
    tf.feature_column.indicator_column(h_plus_ind)
]
classifier = tf.estimator.DNNClassifier(feature_columns=feature_columns,
hidden_units=[10, 20, 10],
n_classes=3,
model_dir="/tmp/ret_model")
train_input_fn = tf.estimator.inputs.numpy_input_fn(
x={"x": np.array(my_data[:, 0:3], dtype=str)},
y=np.array(np.array(my_data[:, 3], dtype=str)),
num_epochs=None,
shuffle=True)
classifier.train(input_fn=train_input_fn, steps=2000)
Unfortunately, I am getting the following error. I have tried trimming the header labels off the CSV file versus leaving them, giving the feature columns different names, and changing the dtype of the numpy array, but the error persists.
ValueError: Feature h_plus_ind is not in features dictionary.
If I remove h_plus_ind, it simply throws the same error for a different column.
If you are using an already existing dataset, it is advised to rename its columns so they match the feature column names. The keys of the features dictionary returned by the input_fn must match the names given to the feature columns; here numpy_input_fn only produces the single key "x", so none of the declared columns (policy_state, modern_classic_ind, h_plus_ind) can be found.
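Here is a minimal sketch of that fix, reusing my_data and classifier from the code above and assuming the remaining columns of my_data are, in order, policy_state, modern_classic_ind, h_plus_ind, and then the retention label:

train_input_fn = tf.estimator.inputs.numpy_input_fn(
    x={'policy_state': my_data[:, 0],        # dict keys must match the
       'modern_classic_ind': my_data[:, 1],  # feature column names exactly
       'h_plus_ind': my_data[:, 2]},
    y=my_data[:, 3].astype(np.int32),        # DNNClassifier expects integer class ids by default
    num_epochs=None,
    shuffle=True)
classifier.train(input_fn=train_input_fn, steps=2000)

Alternatively, reading the CSV with pandas.read_csv and feeding the DataFrame through tf.estimator.inputs.pandas_input_fn keeps the column names attached to the data automatically, which is what the advice about renaming columns refers to.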