I'm trying to use a local training job in SageMaker.
Following this AWS notebook (https://github.com/awslabs/amazon-sagemaker-examples/blob/master/sagemaker-python-sdk/mxnet_gluon_mnist/mxnet_mnist_with_gluon_local_mode.ipynb) I was able to train and predict locally.
Is there any way to train locally and have the trained model appear in the Amazon SageMaker Training Jobs section? Otherwise, how can I properly save models that I trained using local mode?
SageMaker Pipelines local mode is an easy way to test your training, processing and inference scripts, as well as the runtime compatibility of pipeline parameters before you execute your pipeline on the managed SageMaker service. By using local mode, you can test your SageMaker pipeline locally using a smaller dataset.
There is no way to have your local mode training jobs appear in the AWS console. The intent of local mode is to allow for faster iteration/debugging before using SageMaker for training your model.
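To illustrate the iteration workflow: with the SageMaker Python SDK, switching between local mode and a managed training job is essentially just the `instance_type` argument. A minimal sketch, assuming an MXNet training script `train.py` like the one in the linked notebook (the role and script names are placeholders):

```python
# Sketch of local-mode vs. managed training with the SageMaker Python SDK.
# "local" runs the job in Docker on your machine; a real instance type like
# "ml.m5.xlarge" runs it as a managed SageMaker Training Job in the console.
def estimator_kwargs(local: bool) -> dict:
    return {
        "entry_point": "train.py",             # your training script (assumption)
        "role": "<sagemaker-execution-role>",  # placeholder ARN
        "instance_count": 1,
        "instance_type": "local" if local else "ml.m5.xlarge",
    }

# Usage (requires the sagemaker package and Docker for local mode):
# from sagemaker.mxnet import MXNet
# MXNet(framework_version="1.8.0", py_version="py37",
#       **estimator_kwargs(local=True)).fit({"training": "file://./data"})
```

Only the managed variant produces a Training Job entry in the console; the local variant leaves the artifacts on your machine.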
You can create SageMaker Models from local model artifacts. Compress your model artifacts into a .tar.gz file, upload that file to S3, and then create the Model (with the SDK or in the console).
As @lauren said, just compress it and create your model. Once you've trained it locally, you don't need to save it as a training job, since you already have the artifacts for a model.
Training jobs are a combination of input_location, output_location, chosen algorithm, and hyperparameters. That's what is saved in a training job, not the trained model itself. When a training job completes, it compresses the artifacts and saves your model to Amazon S3, so you can create a Model out of it.
So, since you trained locally (instead of decoupling the training step), create a Model from the compressed artifacts, then create an endpoint and run some inferences.
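The deploy-and-invoke part of that flow can be sketched with boto3 as follows. The model name, endpoint names, instance type, and payload format are placeholder assumptions; `create_endpoint_config`, `create_endpoint`, and `invoke_endpoint` are the actual boto3 calls:

```python
def endpoint_config_request(model_name: str, config_name: str) -> dict:
    """Build the request for a single-variant endpoint config.
    Instance type is a placeholder assumption."""
    return {
        "EndpointConfigName": config_name,
        "ProductionVariants": [{
            "VariantName": "AllTraffic",
            "ModelName": model_name,
            "InitialInstanceCount": 1,
            "InstanceType": "ml.m5.large",
        }],
    }

def deploy_and_invoke(model_name: str, payload: bytes) -> bytes:
    """Create an endpoint for an existing SageMaker Model and send it a request."""
    import boto3
    sm = boto3.client("sagemaker")
    sm.create_endpoint_config(**endpoint_config_request(model_name, model_name + "-cfg"))
    sm.create_endpoint(EndpointName=model_name + "-ep",
                       EndpointConfigName=model_name + "-cfg")
    # ...wait for the endpoint status to become InService, then invoke it:
    rt = boto3.client("sagemaker-runtime")
    resp = rt.invoke_endpoint(EndpointName=model_name + "-ep",
                              ContentType="application/json",
                              Body=payload)
    return resp["Body"].read()
```

Remember that a live endpoint bills per instance-hour, so delete it (`delete_endpoint`) when you are done experimenting.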