Data capture on SageMaker serverless inference endpoints


I can see that data capture is supported only for real-time endpoints and batch transform jobs. Is there any suggested workaround for SageMaker serverless inference endpoints? I would like to set up data quality and model quality monitors for a serverless inference endpoint.

vik
asked 7 months ago · 249 views
1 Answer

Hi,

Here are my thoughts; I haven't validated this yet.

If you want data capture on a serverless inference endpoint, you would need to build your own solution that mimics what the data capture feature does: capture the API requests and responses, convert them to the same format the data capture feature uses, and store them in S3. You can discover this format by deploying a standard real-time endpoint with data capture enabled and inspecting its output.

This means your inference container will need to serialise and write each request (and its response) to S3 in the same format a standard data capture configuration would.
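To make that concrete, here is a minimal sketch of building one capture record per invocation. The field names (`captureData`, `endpointInput`, `endpointOutput`, `eventMetadata`) follow the documented data capture JSON Lines format, but you should verify them against a file captured from a real-time endpoint before relying on them; the hour-partitioned key layout in `capture_s3_key` is my assumption about how data capture organises its S3 prefix.

```python
import json
import uuid
from datetime import datetime, timezone


def build_capture_record(request_body: str, response_body: str,
                         content_type: str = "text/csv") -> str:
    """Build one JSON line in the shape SageMaker data capture writes.

    Field names follow the documented data capture file format; verify
    them against output captured from a real-time endpoint.
    """
    record = {
        "captureData": {
            "endpointInput": {
                "observedContentType": content_type,
                "mode": "INPUT",
                "data": request_body,
                "encoding": "CSV",
            },
            "endpointOutput": {
                "observedContentType": content_type,
                "mode": "OUTPUT",
                "data": response_body,
                "encoding": "CSV",
            },
        },
        "eventMetadata": {
            "eventId": str(uuid.uuid4()),
            "inferenceTime": datetime.now(timezone.utc)
                .strftime("%Y-%m-%dT%H:%M:%SZ"),
        },
        "eventVersion": "0",
    }
    return json.dumps(record)


def capture_s3_key(endpoint_name: str,
                   variant_name: str = "AllTraffic") -> str:
    """Hour-partitioned key mimicking data capture's layout (assumption)."""
    now = datetime.now(timezone.utc)
    return (f"{endpoint_name}/{variant_name}/"
            f"{now:%Y/%m/%d/%H}/{uuid.uuid4()}.jsonl")
```

In the container's invocation handler you could then upload each line with `boto3.client("s3").put_object(...)`, or buffer records and flush periodically to keep invocation latency down.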

Once the data is in S3, you would need to kick off the data/model quality monitoring jobs manually.
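One way to kick those jobs off is to run the model monitor analyzer container yourself as a SageMaker processing job, for example on an hourly EventBridge schedule. The sketch below only builds the `create_processing_job` request; the environment variable names (`dataset_format`, `dataset_source`, `output_path`, `publish_cloudwatch_metrics`) are my assumptions about what the `sagemaker-model-monitor-analyzer` image expects, so describe a processing job launched by a real monitoring schedule and copy its configuration before using this.

```python
import json


def build_monitoring_job_request(job_name: str, role_arn: str,
                                 analyzer_image_uri: str,
                                 capture_s3_uri: str,
                                 output_s3_uri: str) -> dict:
    """Arguments for sagemaker.create_processing_job running the model
    monitor analyzer over self-captured data in S3 (assumed env vars)."""
    local_input = "/opt/ml/processing/input/endpoint_data"
    local_output = "/opt/ml/processing/output"
    return {
        "ProcessingJobName": job_name,
        "RoleArn": role_arn,
        "AppSpecification": {"ImageUri": analyzer_image_uri},
        "ProcessingInputs": [{
            "InputName": "endpoint_data",
            "S3Input": {
                "S3Uri": capture_s3_uri,
                "LocalPath": local_input,
                "S3DataType": "S3Prefix",
                "S3InputMode": "File",
            },
        }],
        "ProcessingOutputConfig": {"Outputs": [{
            "OutputName": "result",
            "S3Output": {
                "S3Uri": output_s3_uri,
                "LocalPath": local_output,
                "S3UploadMode": "EndOfJob",
            },
        }]},
        "ProcessingResources": {"ClusterConfig": {
            "InstanceCount": 1,
            "InstanceType": "ml.m5.xlarge",
            "VolumeSizeInGB": 20,
        }},
        "StoppingCondition": {"MaxRuntimeInSeconds": 1800},
        "Environment": {
            # Assumed analyzer settings -- copy the exact values from a
            # processing job created by a real monitoring schedule.
            "dataset_format": json.dumps({"sagemakerCaptureJson": {}}),
            "dataset_source": local_input,
            "output_path": local_output,
            "publish_cloudwatch_metrics": "Disabled",
        },
    }
```

You would pass the result to `boto3.client("sagemaker").create_processing_job(**request)`. Baseline `statistics.json`/`constraints.json` files, if you have them, can be supplied the same way as an additional processing input.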

An alternative (non-AWS) approach would be to integrate a third-party monitoring service, such as whylabs.ai, directly within your inference container.

answered 7 months ago
  • Thanks, Marcus. I've started building a custom solution to mimic the data capture feature provided by real-time endpoints. Just wanted to make sure I wasn't missing any obvious solutions.
