I am trying to store my model to HDFS using Python. This code uses the pydoop library:
import pydoop.hdfs as hdfs
from_path = prediction_model.fit(orginal_telecom_80p_train[features], orginal_telecom_80p_train["Churn"])
to_path ='hdfs://192.168.1.101:8020/user/volumata/python_models/churn_model.sav'
hdfs.put(from_path, to_path)
But while using this, I am getting this error:
AttributeError: 'LogisticRegression' object has no attribute 'startswith'
Then I tried using the pickle option:
import pickle
with open('hdfs://192.168.1.101:8020/user/volumata/python_models/') as hdfs_loc:
    pickle.dump(prediction_model, hdfs_loc)
The pickle option works fine locally, but when I tried to store the model in HDFS, it also did not work for me. Can anyone please suggest how to proceed with storing models to HDFS from a Python script?
You have to use hdfs.open instead of the built-in open, and open the file for writing:
import pickle
import pydoop.hdfs as hdfs
with hdfs.open(to_path, 'w') as f:
    pickle.dump(prediction_model, f)
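The same handle-based pattern also round-trips the model: pickle.load on a handle opened for reading restores the object. Here is a minimal sketch using an in-memory buffer in place of the HDFS file handle, with a plain dict standing in for the fitted LogisticRegression (the hdfs.open calls in the comments are the ones you would use against the cluster):

```python
import io
import pickle

# Stand-in for a trained model; a real sklearn estimator pickles the same way
prediction_model = {"coef": [0.4, -1.2], "intercept": 0.1}

# In-memory binary buffer simulating the HDFS file handle;
# against the cluster this would be: hdfs.open(to_path, 'w')
buf = io.BytesIO()
pickle.dump(prediction_model, buf)

# Read the model back, as you would with: hdfs.open(to_path, 'r')
buf.seek(0)
restored = pickle.load(buf)
assert restored == prediction_model
```

The key point is that pickle only needs a file-like object opened in binary mode, so any handle returned by hdfs.open works in place of a local file.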