I am working on building a machine learning pipeline for time-series data, where the goal is to retrain and update the model frequently to make predictions.
I am confused about how to use the same preprocessing code for both training and inference. Should I write a Lambda function to preprocess my data, or is there another way?
Sources looked into:
The two examples given by the AWS SageMaker team use AWS Glue to do the ETL transform:
inference_pipeline_sparkml_xgboost_abalone
inference_pipeline_sparkml_blazingtext_dbpedia
I am new to AWS SageMaker and trying to learn, understand, and build the flow. Any help is appreciated!
Answering the problems in reverse order.
From your example, the code below is the inference pipeline where two models are chained together. Here we need to remove sparkml_model and use our sklearn model instead.
sm_model = PipelineModel(name=model_name, role=role, models=[sparkml_model, xgb_model])
Before placing the sklearn model in the pipeline, we need the SageMaker version of the SKLearn model.
First, create the SKLearn estimator using the SageMaker Python SDK:
sklearn_preprocessor = SKLearn(
    entry_point=script_path,
    role=role,
    train_instance_type="ml.c4.xlarge",
    sagemaker_session=sagemaker_session)
script_path - this is the Python script that contains all the preprocessing (transformation) logic; see 'sklearn_abalone_featurizer.py' in the link given below.
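To make the "same preprocessing for training and inference" idea concrete, here is a minimal sketch of the hooks such an entry-point script provides. The SageMaker SKLearn container calls model_fn/input_fn/predict_fn at inference time; the function names are the container's convention, but the scaler and CSV format here are illustrative assumptions, not the actual featurizer from the example notebook.

```python
# Hypothetical sketch of an entry-point script for the SKLearn container.
# At training time the script fits and saves the preprocessor; at inference
# time the container loads it via model_fn and applies it via predict_fn,
# so the exact same transformation code runs in both phases.
import os

import joblib
import numpy as np
from sklearn.preprocessing import StandardScaler


def model_fn(model_dir):
    """Load the preprocessor that was fitted and saved during training."""
    return joblib.load(os.path.join(model_dir, "model.joblib"))


def input_fn(request_body, content_type):
    """Parse a CSV request body into a 2-D numpy array."""
    if content_type == "text/csv":
        rows = [list(map(float, line.split(",")))
                for line in request_body.strip().split("\n")]
        return np.array(rows)
    raise ValueError(f"Unsupported content type: {content_type}")


def predict_fn(input_data, preprocessor):
    """Apply the fitted transformation to the incoming raw data."""
    return preprocessor.transform(input_data)
```

Because the fitted preprocessor is serialized with the model artifacts, inference clients send raw rows and never duplicate the transformation logic.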
Train the SKLearn estimator:
sklearn_preprocessor.fit({'train': train_input})
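For completeness, train_input is just an S3 URI pointing at the training data. One common way to produce it is the session's upload_data helper; the bucket name, prefix, and file name below are placeholders, not values from the example.

```python
# Hypothetical sketch of preparing train_input. upload_data is the real
# SageMaker Session API; the bucket, prefix, and file name are made up.
# The call itself is commented out because it touches real AWS resources.
#
# train_input = sagemaker_session.upload_data(
#     path="abalone_train.csv",             # local raw training data
#     bucket="my-sagemaker-bucket",         # assumed bucket name
#     key_prefix="sklearn-preprocess/train")
#
# upload_data returns an S3 URI of this shape:
train_input = "s3://my-sagemaker-bucket/sklearn-preprocess/train/abalone_train.csv"
```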
Create a SageMaker model from the SKLearn estimator that can be put in the inference pipeline:
sklearn_inference_model = sklearn_preprocessor.create_model()
The inference PipelineModel creation is then modified as shown below:
sm_model = PipelineModel(name=model_name, role=role, models=[sklearn_inference_model, xgb_model])
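Once the PipelineModel exists, you deploy it once and send raw CSV rows to the endpoint; the sklearn container transforms each row before handing it to XGBoost, so no client-side preprocessing is needed. A small sketch, with the endpoint name and feature values as made-up examples and the AWS calls commented out since they provision real resources:

```python
# Hypothetical sketch of deploying and invoking the pipeline endpoint.
# Endpoint name and feature values are illustrative placeholders.

def build_csv_payload(features):
    """Serialize one raw feature row as the text/csv body that the
    preprocessing container's input_fn expects."""
    return ",".join(str(f) for f in features)

# predictor = sm_model.deploy(
#     initial_instance_count=1,
#     instance_type="ml.c4.xlarge",
#     endpoint_name="inference-pipeline-endpoint")  # assumed name
#
# The sklearn container transforms the raw row, then passes the result
# to the XGBoost container inside the same endpoint:
# response = predictor.predict(build_csv_payload([0.455, 0.365, 0.095]))

payload = build_csv_payload([0.455, 0.365, 0.095])
```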
For more details, refer to the link below.
https://github.com/awslabs/amazon-sagemaker-examples/blob/master/sagemaker-python-sdk/scikit_learn_inference_pipeline/Inference%20Pipeline%20with%20Scikit-learn%20and%20Linear%20Learner.ipynb