Hey Ken,
As you said, since the input_handler does not have access to the HTTP headers (its context parameter only contains information about the model and the system environment), a potential workaround is to include the metadata in the body of the request along with the binary content and then parse it in the input_handler or input_fn function. However, this would require changes to both the client sending the request and the inference script, so I'm not sure whether that's workable in your case, but it's a thought.
-Zac
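A minimal sketch of the body-embedding workaround Zac describes: a one-line JSON metadata header is prepended to the binary payload on the client, and the inference script's input_fn splits it back off. The function and field names here are illustrative, not part of any SageMaker API.

```python
import json

# Hypothetical client-side helper: prepend a one-line JSON metadata
# header to the binary payload, separated by a single newline byte.
def build_request_body(metadata: dict, binary_content: bytes) -> bytes:
    header = json.dumps(metadata).encode("utf-8")
    return header + b"\n" + binary_content

# Hypothetical server-side input_fn: partition on the FIRST newline only,
# so newlines inside the binary payload are left untouched (the JSON
# header itself cannot contain a raw newline).
def input_fn(request_body: bytes, content_type: str):
    header, _, payload = request_body.partition(b"\n")
    metadata = json.loads(header.decode("utf-8"))
    return metadata, payload

body = build_request_body({"request-id": "abc-123"}, b"\x00\x01binary")
metadata, payload = input_fn(body, "application/octet-stream")
```

This keeps the wire format trivial to parse, at the cost of the client and the inference script having to agree on the framing convention.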
Hi Ken,
If you would like to pass custom values from your HTTP header when invoking your SageMaker endpoint, you can look into using the CustomAttributes header, as described in [1, 2]. CustomAttributes can be used to provide additional information about an inference request or response. Amazon SageMaker strips all POST headers except those supported by the InvokeEndpoint API action, so the CustomAttributes header is the supported way to pass metadata alongside the inference request or response.
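As a sketch, the metadata can be JSON-encoded into the CustomAttributes parameter of InvokeEndpoint. The endpoint name and metadata keys below are placeholders; the boto3 call itself is shown in comments rather than executed, since it requires live AWS credentials and a deployed endpoint.

```python
import json

def build_invoke_kwargs(endpoint_name: str, payload: bytes, metadata: dict) -> dict:
    """Assemble keyword arguments for sagemaker-runtime invoke_endpoint."""
    return {
        "EndpointName": endpoint_name,
        "ContentType": "application/octet-stream",
        "Body": payload,
        # Metadata travels in the CustomAttributes header; the API
        # limits this string to 1024 characters.
        "CustomAttributes": json.dumps(metadata),
    }

kwargs = build_invoke_kwargs("my-endpoint", b"\x00\x01binary", {"request-id": "abc-123"})
# import boto3
# runtime = boto3.client("sagemaker-runtime")
# response = runtime.invoke_endpoint(**kwargs)
```

On the serving side, the attribute string is surfaced through the request context, so the inference script can decode the same JSON to recover the metadata.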
I hope this helps and should you still face issues, you can reach out to AWS Premium Support for further assistance. Kindly note that AWS Support can provide guidance but wouldn't be capable of re-coding your function as it's out of scope.
References
[1] https://docs.aws.amazon.com/sagemaker/latest/APIReference/API_runtime_InvokeEndpoint.html
[2] https://aws.amazon.com/blogs/machine-learning/amazon-sagemaker-runtime-now-supports-the-customattribute-header/