In SageMaker, how do I get a value from an HTTP header?


I have a machine learning classification model that was trained outside of SageMaker. The model is in scikit-learn format. To run this model, the preprocessing step requires both the binary content of a file and its metadata (filename, author, and creation date) to generate the feature vector. Since the request payload is used to send the binary content, I use an HTTP header to send the additional metadata to SageMaker.

Referring to this post, I created a function called input_handler(data, context) in the inference.py file to retrieve the custom header value. However, this function was never invoked during inference, so I am not sure whether input_handler works with SKLearnModel at all. Are there other ways to get values from the header?
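For reference, this is roughly the handler I added (a minimal sketch only; it follows the TensorFlow Serving container pattern from that post, where the context object is expected to expose custom_attributes, which may not apply to the scikit-learn container):

import json

def input_handler(data, context):
    # Expected to read the X-Amzn-SageMaker-Custom-Attributes header via the
    # context object (TensorFlow Serving container pattern); this was never
    # called when deployed with SKLearnModel.
    metadata = json.loads(context.custom_attributes or '{}')
    payload = data.read()
    # ... combine payload bytes and metadata into the feature vector here
    return payload, metadata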

I used the class sagemaker.sklearn.SKLearnModel to deploy the model.

This is how I deployed the model:

from sagemaker.sklearn import SKLearnModel
from sagemaker import get_execution_role


role = get_execution_role()
model_path = f's3://{bucket_name}/{model_name}.tar.gz'
entry_point = 'inference_ransomware.py'
dependencies = ...

model = SKLearnModel(
    model_data=model_path,
    role=role,
    entry_point=entry_point,
    dependencies=dependencies,
    framework_version='1.2-1'
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type='ml.m4.xlarge',
)

import json

filename = ...

with open(filename, mode='rb') as file:
    byte_data = file.read()
    predictor.predict(
        byte_data,
        initial_args={'CustomAttributes': json.dumps({'filename': filename})},
    )
Ken
Asked 1 year ago · Viewed 445 times
2 Answers

Hey Ken,

As you said, input_handler does not have access to the HTTP headers; its context parameter only contains information about the model and the system environment. A potential workaround could be to include the metadata in the body of the request along with the binary content, and then parse it in the input_handler or input_fn function (a rough sketch follows). However, I realize this would require changes to both the client sending the request and the inference script, so I'm not sure whether that's workable in your case, but it's just a thought.
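Roughly, that workaround could look like this (a sketch only, assuming a JSON body with base64-encoded file bytes; the field names and the helper for your preprocessing are illustrative, not a fixed contract):

# Client side: pack metadata and base64-encoded file bytes into one JSON body.
import base64
import boto3
import json

runtime = boto3.client('sagemaker-runtime')
body = json.dumps({
    'filename': filename,
    'content': base64.b64encode(byte_data).decode('ascii'),
})
runtime.invoke_endpoint(
    EndpointName=predictor.endpoint_name,
    ContentType='application/json',
    Body=body,
)

# inference.py: parse the combined body in input_fn.
import base64
import json

def input_fn(input_data, content_type):
    if content_type == 'application/json':
        request = json.loads(input_data)
        byte_data = base64.b64decode(request['content'])
        filename = request['filename']
        # ... build the feature vector from byte_data and filename here
        return byte_data
    raise ValueError(f'Unsupported content type: {content_type}')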

-Zac

Zac Dan
Answered 1 year ago

Hi Ken,

If you would like to pass custom values along with a request to your SageMaker endpoint, you can look into using CustomAttributes when invoking the endpoint, as described in [1, 2]. CustomAttributes can be used to provide additional information about an inference request or response. Amazon SageMaker strips all POST headers except those supported by the InvokeEndpoint API action, so the CustomAttributes header is the supported way to pass metadata with an inference request or response.
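For example, with the boto3 SageMaker runtime client (a minimal sketch based on the InvokeEndpoint API in [1]; the endpoint name and content type are placeholders for your setup, and whether your container's handler can read the attribute back is a separate question):

import boto3
import json

runtime = boto3.client('sagemaker-runtime')
response = runtime.invoke_endpoint(
    EndpointName=predictor.endpoint_name,
    ContentType='application/octet-stream',
    Body=byte_data,
    # Metadata travels in the CustomAttributes header rather than the payload.
    CustomAttributes=json.dumps({'filename': filename}),
)
print(response['Body'].read())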

I hope this helps. Should you still face issues, you can reach out to AWS Premium Support for further assistance. Kindly note that AWS Support can provide guidance but would not be able to re-code your function, as that is out of scope.

References

[1] https://docs.aws.amazon.com/sagemaker/latest/APIReference/API_runtime_InvokeEndpoint.html
[2] https://aws.amazon.com/blogs/machine-learning/amazon-sagemaker-runtime-now-supports-the-customattribute-header/

Answered 1 year ago
