I'm training different classifiers on my data. I didn't have a problem until I updated all my packages and Python itself a couple of days ago. The warning appears only with the KNeighbors classifier, and because I'm running a huge loop in Jupyter, I can't see the results: every iteration prints this warning: ...
sklearn/neighbors/_classification.py:237: FutureWarning: Unlike other reduction functions (e.g. `skew`, `kurtosis`), the default behavior of `mode` typically preserves the axis it acts along. In SciPy 1.11.0, this behavior will change: the default value of `keepdims` will become False, the `axis` over which the statistic is taken will be eliminated, and the value None will no longer be accepted. Set `keepdims` to True or False to avoid this warning.
mode, _ = stats.mode(_y[neigh_ind, k], axis=1)
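For context, the behavior change the warning describes can be reproduced directly with SciPy. A minimal sketch, assuming SciPy >= 1.9 (the version that introduced the `keepdims` argument):

```python
import numpy as np
from scipy import stats

a = np.array([[1, 2, 2],
              [3, 3, 4]])

# Passing keepdims explicitly silences the FutureWarning and pins the
# output shape regardless of SciPy version.
m_keep, _ = stats.mode(a, axis=1, keepdims=True)   # axis kept: shape (2, 1)
m_flat, _ = stats.mode(a, axis=1, keepdims=False)  # axis dropped: shape (2,)
print(m_keep.shape, m_flat.shape)
```

Note that the warning here comes from inside scikit-learn's own call to `stats.mode`, so as a user you can't pass `keepdims` yourself; upgrading scikit-learn (or filtering the warning, as in the answer below) is the practical fix.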
This is my code:
    import numpy as np
    import matplotlib.pyplot as plt
    import sklearn.model_selection as skl_ms
    import sklearn.neighbors as skl_nb

    n_fold = 200
    k_range = range(1, 100)
    misclassification = np.zeros((n_fold, len(k_range)))

    for i in range(n_fold):
        x = gender.drop(["Lead"], axis=1).values
        y = gender["Lead"].values
        x_train, x_test, y_train, y_test = skl_ms.train_test_split(x, y, test_size=0.2)
        for j, k in enumerate(k_range):
            model = skl_nb.KNeighborsClassifier(n_neighbors=k)
            model.fit(x_train, y_train)
            prdct = model.predict(x_test)
            misclassification[i, j] = np.mean(prdct != y_test)

    plts = np.linspace(1, 200, 200)
    plt.plot(plts, misclassification, '.')
    plt.title("K Fold Classification")
    plt.ylabel('Misclassification')
    plt.xlabel('number of neighbors')
    plt.show()

    # pick the k with the lowest mean misclassification across folds
    mean_misclas = np.mean(misclassification, axis=0)
    min_prdct = min(mean_misclas)
    for m in range(len(mean_misclas)):
        if mean_misclas[m] == min_prdct:
            ind = m
            break
    min_k = ind + 1

    model = skl_nb.KNeighborsClassifier(n_neighbors=min_k)
    model.fit(x_train, y_train)
    prdct = model.predict(x_test)
    result = np.mean(prdct != y_test)
    print('misclassification is: %.3f' % result)
    print('accuracy is: %.3f' % np.mean(prdct == y_test))
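As an aside, the loop that searches `mean_misclas` for its minimum can be replaced by `np.argmin`, which likewise returns the first index of the minimum. A small sketch with a hypothetical stand-in array for `mean_misclas`:

```python
import numpy as np

# Hypothetical mean misclassification per k (stand-in for mean_misclas).
mean_misclas = np.array([0.30, 0.25, 0.27, 0.25])

# np.argmin returns the first index of the minimum, matching the
# manual loop-with-break above; +1 maps a 0-based index to k.
min_k = int(np.argmin(mean_misclas)) + 1
print(min_k)  # → 2
```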
I'm also using the AdaBoost classifier, and it doesn't raise the same warning. This is the code:
    from sklearn.ensemble import AdaBoostClassifier

    x = gender.drop(["Lead"], axis=1).values
    y = gender["Lead"].values
    x_train, x_test, y_train, y_test = skl_ms.train_test_split(x, y, test_size=0.2)

    modelAda = AdaBoostClassifier()
    modelAda.fit(x_train, y_train)
    predict = modelAda.predict(x_test)
    print('misclassification: %.3f' % np.mean(predict != y_test))
    print('accuracy is: %.3f' % np.mean(predict == y_test))
I ran into the same problem. For me, the easiest way to get rid of these messages is this:
from warnings import simplefilter
simplefilter(action='ignore', category=FutureWarning)