I try to execute this code and I get the error below. The error comes from the random function and I don't know how to fix it. Please help me.
def load_data(sample_split=0.3, usage='Training', to_cat=True, verbose=True,
              classes=['Angry', 'Happy'], filepath='C:/Users/Oussama/Desktop/fer2013.csv'):
    df = pd.read_csv(filepath)
    # print(df.tail())
    # print(df.Usage.value_counts())
    df = df[df.Usage == usage]
    frames = []
    classes.append('Disgust')
    for _class in classes:
        class_df = df[df['emotion'] == emotion[_class]]
        frames.append(class_df)
    data = pd.concat(frames, axis=0)
    rows = random.sample(data.index, int(len(data) * sample_split))
    data = data.ix[rows]
    print('{} set for {}: {}'.format(usage, classes, data.shape))
    data['pixels'] = data.pixels.apply(lambda x: reconstruct(x))
    x = np.array([mat for mat in data.pixels])  # (n_samples, img_width, img_height)
    X_train = x.reshape(-1, 1, x.shape[1], x.shape[2])
    y_train, new_dict = emotion_count(data.emotion, classes, verbose)
    print(new_dict)
    if to_cat:
        y_train = to_categorical(y_train)
    return X_train, y_train, new_dict
I get this:
Traceback (most recent call last):
  File "fer2013datagen.py", line 71, in <module>
    verbose=True)
  File "fer2013datagen.py", line 47, in load_data
    rows = random.sample(data, int(len(data)*sample_split))
  File "C:\Users\Oussama\AppData\Local\Programs\Python\Python35\lib\random.py", line 311, in sample
    raise TypeError("Population must be a sequence or set. For dicts, use list(d).")
TypeError: Population must be a sequence or set. For dicts, use list(d).
The code you posted has:

    rows = random.sample(data.index, int(len(data)*sample_split))

But the traceback shows:

    rows = random.sample(data, int(len(data)*sample_split))

Why are they different? Did you modify the file after running it? And what type is data: a list, a dict, or something else?

Also, the error message already tells you how to fix it: the first parameter of random.sample must be a sequence or a set. For a dict, pass list(d).
For example:

    d = {'a': 1, 'b': 2}
    random.sample(list(d), 1)

instead of:

    random.sample(d, 1)
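In your case, data is a pandas DataFrame and data.index is a pandas Index; in Python 3, random.sample rejects both with this same TypeError, because neither registers as a sequence. A minimal sketch of the fix, converting the index to a plain list first (the DataFrame contents below are dummy values for illustration, and I use .loc instead of .ix, which is deprecated in recent pandas):

    import random
    import pandas as pd

    # Dummy stand-in for the real fer2013 data
    data = pd.DataFrame({'emotion': [0, 3, 3, 5], 'pixels': ['p0', 'p1', 'p2', 'p3']})
    sample_split = 0.5

    # list(...) turns the Index into a sequence that random.sample accepts
    rows = random.sample(list(data.index), int(len(data) * sample_split))
    data = data.loc[rows]

Alternatively, pandas can do the sampling itself with data = data.sample(frac=sample_split), which avoids random.sample entirely.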