I'm running the following script:
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
from sklearn.preprocessing import OneHotEncoder
dataset = pd.read_csv('data/50_Startups.csv')
X = dataset.iloc[:, :-1].values
y = dataset.iloc[:, 4].values
onehotencoder = OneHotEncoder(categorical_features=3,
handle_unknown='ignore')
onehotencoder.fit(X)
The head of the data looks like the five sample rows quoted in the answer below.
And I've got this:
ValueError: could not convert string to float: 'New York'
I read the answers to similar questions and then opened the scikit-learn documentation, but as you can see, the scikit-learn authors don't have issues with spaces in strings.
I know that I can use LabelEncoder from sklearn.preprocessing and then apply OneHotEncoder, and it works well, but in that case this message occurs:

In case you used a LabelEncoder before this OneHotEncoder to convert the categories to integers, then you can now use the OneHotEncoder directly.
warnings.warn(msg, FutureWarning)
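For reference, a minimal sketch of that two-step workaround (my own reconstruction, assuming the same X as above; categorical_features only exists in scikit-learn < 0.22, and on 0.20.x this pattern is what triggers the FutureWarning):

from sklearn.preprocessing import LabelEncoder, OneHotEncoder

labelencoder = LabelEncoder()
# Encode the state strings as integers: 'California' -> 0, 'Florida' -> 1, 'New York' -> 2
X[:, 3] = labelencoder.fit_transform(X[:, 3])
# Legacy pattern: one-hot encode column 3 of the now all-numeric array
onehotencoder = OneHotEncoder(categorical_features=[3])
X_encoded = onehotencoder.fit_transform(X).toarray()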
You can use the full CSV file, or the following first five rows, to test this code:

[[165349.2, 136897.8, 471784.1, 'New York', 192261.83],
 [162597.7, 151377.59, 443898.53, 'California', 191792.06],
 [153441.51, 101145.55, 407934.54, 'Florida', 191050.39],
 [144372.41, 118671.85, 383199.62, 'New York', 182901.99],
 [142107.34, 91391.77, 366168.42, 'Florida', 166187.94]]
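If you take the five sample rows, one way to build a usable X is the following sketch; dtype=object keeps the numeric columns numeric instead of letting the strings force every entry to become a string:

import numpy as np

X = np.array([[165349.2, 136897.8, 471784.1, 'New York', 192261.83],
              [162597.7, 151377.59, 443898.53, 'California', 191792.06],
              [153441.51, 101145.55, 407934.54, 'Florida', 191050.39],
              [144372.41, 118671.85, 383199.62, 'New York', 182901.99],
              [142107.34, 91391.77, 366168.42, 'Florida', 166187.94]],
             dtype=object)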
It is categorical_features=3 that hurts you. You cannot use categorical_features with string data. Remove this option, and luck will be with you. Also, you probably need fit_transform, not fit as such:
onehotencoder = OneHotEncoder(handle_unknown='ignore')
# Encode only the string column; X[:, [3]] keeps it 2-D, as the encoder expects
transformed = onehotencoder.fit_transform(X[:, [3]]).toarray()
# Reassemble: the three numeric columns, then the dummies, then the profit column
X1 = np.concatenate([X[:, :3], transformed, X[:, 4:]], axis=1)
#array([[165349.2, 136897.8, 471784.1, 0.0, 0.0, 1.0, 192261.83],
#       [162597.7, 151377.59, 443898.53, 1.0, 0.0, 0.0, 191792.06],
#       [153441.51, 101145.55, 407934.54, 0.0, 1.0, 0.0, 191050.39],
#       [144372.41, 118671.85, 383199.62, 0.0, 0.0, 1.0, 182901.99],
#       [142107.34, 91391.77, 366168.42, 0.0, 1.0, 0.0, 166187.94]])
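The dummy column order follows the categories learned during fit, which you can confirm through the fitted categories_ attribute:

print(onehotencoder.categories_)
# [array(['California', 'Florida', 'New York'], dtype=object)]

On scikit-learn 0.20+, a ColumnTransformer is the intended replacement for categorical_features and does the same job in one step. A sketch, assuming the same 5-column X as above; note it puts the one-hot columns first and the passthrough columns after them, and on recent scikit-learn versions the sparse argument is spelled sparse_output:

from sklearn.compose import ColumnTransformer

ct = ColumnTransformer(
    [('state', OneHotEncoder(handle_unknown='ignore', sparse=False), [3])],
    remainder='passthrough')
X1 = ct.fit_transform(X)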