Data capture on SageMaker serverless inference endpoints


I can see that data capture is supported only for real-time endpoints and batch transform jobs. Is there any suggested workaround for SageMaker serverless inference endpoints? I would like to set up data monitoring and model monitors for a serverless inference endpoint.

vik
Asked 7 months ago · Viewed 250 times
1 Answer

Hi,

Here are my thoughts; I haven't validated this yet.

If you want to do data capture on a serverless inference endpoint, you would need to build your own solution that mimics what the data capture feature does: capturing the API requests, converting them to the same format the data capture feature uses, and storing them in S3. You can see that format by deploying a standard endpoint with data capture enabled and inspecting its output.

This means your inference container would need to serialise each request and response and write it to S3 in the same format a standard data capture configuration would produce, along the lines of the sketch below.
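For illustration, here is a minimal, untested Python sketch of serialising one request/response pair into a data-capture-style JSON Lines record and writing it to S3 from inside the container. The bucket, prefix, content type, and field values are placeholders/assumptions; compare the record layout against real capture files from a standard endpoint before relying on it.

```python
import datetime
import json
import uuid

import boto3

s3 = boto3.client("s3")

# Hypothetical destination; match the prefix layout your monitoring job expects.
CAPTURE_BUCKET = "my-data-capture-bucket"
CAPTURE_PREFIX = "serverless-endpoint/datacapture"


def capture_record(request_body: str, response_body: str, content_type: str = "text/csv") -> None:
    """Serialise one request/response pair and write it to S3 as a JSON Lines record."""
    now = datetime.datetime.now(datetime.timezone.utc)
    record = {
        "captureData": {
            "endpointInput": {
                "observedContentType": content_type,
                "mode": "INPUT",
                "data": request_body,
                "encoding": "CSV",
            },
            "endpointOutput": {
                "observedContentType": content_type,
                "mode": "OUTPUT",
                "data": response_body,
                "encoding": "CSV",
            },
        },
        "eventMetadata": {
            "eventId": str(uuid.uuid4()),
            "inferenceTime": now.strftime("%Y-%m-%dT%H:%M:%SZ"),
        },
        "eventVersion": "0",
    }
    # Standard data capture groups files by variant/date/hour; mirror that layout here.
    key = f"{CAPTURE_PREFIX}/AllTraffic/{now:%Y/%m/%d/%H}/{uuid.uuid4()}.jsonl"
    s3.put_object(Bucket=CAPTURE_BUCKET, Key=key, Body=json.dumps(record) + "\n")
```

In practice you would call something like `capture_record()` from your inference handler after producing the response, ideally asynchronously so it does not add latency to the invocation.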

Once the data is in S3, you would need to manually kick off the data/model monitoring jobs, for example by running the Model Monitor container as a SageMaker Processing job (sketched below).
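As a rough, untested sketch, a one-off monitoring run could be launched with the SageMaker Python SDK roughly as follows. The role ARN, S3 paths, instance type, and the container environment variables shown are assumptions; verify them against the Model Monitor documentation for your region and SDK version.

```python
import json

import sagemaker
from sagemaker import image_uris
from sagemaker.processing import ProcessingInput, ProcessingOutput, Processor

session = sagemaker.Session()
region = session.boto_region_name
role = "arn:aws:iam::111122223333:role/MyMonitoringRole"  # placeholder role ARN

# Retrieve the pre-built Model Monitor analyzer image for this region.
monitor_image = image_uris.retrieve(framework="model-monitor", region=region)

processor = Processor(
    image_uri=monitor_image,
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    env={
        # Tell the analyzer the input is capture-style JSON (assumed setting).
        "dataset_format": json.dumps(
            {"sagemakerCaptureJson": {"captureIndexNames": ["endpointInput", "endpointOutput"]}}
        ),
        "dataset_source": "/opt/ml/processing/input/endpoint_data",
        "output_path": "/opt/ml/processing/output",
        "baseline_constraints": "/opt/ml/processing/baseline/constraints/constraints.json",
        "baseline_statistics": "/opt/ml/processing/baseline/stats/statistics.json",
        "publish_cloudwatch_metrics": "Disabled",
    },
)

processor.run(
    inputs=[
        ProcessingInput(
            source="s3://my-data-capture-bucket/serverless-endpoint/datacapture/",
            destination="/opt/ml/processing/input/endpoint_data",
        ),
        ProcessingInput(
            source="s3://my-bucket/baseline/constraints.json",
            destination="/opt/ml/processing/baseline/constraints",
        ),
        ProcessingInput(
            source="s3://my-bucket/baseline/statistics.json",
            destination="/opt/ml/processing/baseline/stats",
        ),
    ],
    outputs=[
        ProcessingOutput(
            source="/opt/ml/processing/output",
            destination="s3://my-bucket/monitoring-reports/",
        )
    ],
)
```

You could trigger this on a schedule (e.g. an EventBridge rule invoking a small Lambda) to approximate the hourly cadence of a standard monitoring schedule.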

An alternative (non-AWS) approach would be to integrate a third-party service such as whylabs.ai directly into your inference container.

Answered 7 months ago
  • Thanks, Marcus. I've started building a custom solution to mimic the data capture feature provided for real-time endpoints. Just wanted to make sure I wasn't missing any obvious solutions.


