
HOW TO: train on Sagemaker and deploy on Local? #1164

@Dgaylard

Description


Please fill out the form below.

System Information

  • Framework / Algorithm: SageMaker built-in KMeans
  • Python Version: 3.7
  • CPU or GPU: Trained on GPUs using RecordIO input format
  • Python SDK Version: Boto3
  • Are you using a custom image: No, the AWS KMeans algorithm image

Describe the problem

Hey there, after trying to get this running for a while now, I'm here.

Basically I have trained KMeans models on SageMaker (great!), but I now want to deploy them locally.

I have the standard KMeans artefact output, a model.tar.gz file, that I:

Download from my S3:

s3_client.download_file('mybucket', 'myfile/path/model.tar.gz',
                        '/tmp/model.tar.gz')

Extract:
os.system('tar -zxvf /tmp/model.tar.gz -C /tmp')
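As an aside, the same extraction can be done without shelling out, using the standard-library tarfile module. This is only a sketch; `extract_model` is a hypothetical helper name, not anything from the SageMaker SDK:

```python
import tarfile

def extract_model(archive_path, dest_dir='/tmp'):
    """Extract a SageMaker model artefact archive (model.tar.gz)
    into dest_dir and return the names of the extracted members."""
    with tarfile.open(archive_path, 'r:gz') as tar:
        tar.extractall(path=dest_dir)
        return tar.getnames()
```

For a KMeans artefact this should yield the model_algo-1 file (plus the state file mentioned below).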

Now I want to take the extracted items (model_algo-1, state_ac4243fa-9838-41d2-b8d0-29601c73fdc3) and load them into a KMeans object so I can actually infer with it locally.

I understand the primary way of serving these models is through SageMaker, but it doesn't seem like much to ask to be able to deploy one locally as part of a much larger application?

Any help would be great; so far I've only got the following:

import mxnet as mx
import pandas as pd

# model_algo-1 is an MXNet NDArray archive; the first entry
# holds the cluster centroids.
eudexCluster = mx.ndarray.load('/tmp/model_algo-1')
cluster_centroids = pd.DataFrame(eudexCluster[0].asnumpy())
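Once the centroids are loaded, local inference reduces to assigning each point to its nearest centroid by Euclidean distance, which is what the hosted KMeans endpoint returns as closest_cluster. A minimal NumPy sketch (`assign_clusters` is a hypothetical helper, not a SageMaker API):

```python
import numpy as np

def assign_clusters(points, centroids):
    """Assign each row of points (n, d) to the index of its
    nearest centroid (k, d) under Euclidean distance."""
    # Broadcast to an (n, k) matrix of point-to-centroid distances.
    dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)
```

With the centroids loaded above, `assign_clusters(X, eudexCluster[0].asnumpy())` would give local cluster assignments for an (n, d) array `X`.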

In addition

What exactly is this state_ac4243fa-9838-41d2-b8d0-29601c73fdc3 file? It looks to me like a checkpoint of some kind?
