@Durga_S A complete notebook. It could probably be refactored into a Python module. Compute-wise it isn't big enough to require ml.m5.large. The cheapest route is probably through EC2, which is what I need to explore.
Unfortunately, there is no option to use ml.t3.medium to schedule a SageMaker Studio notebook job. The smallest instance type for this feature is ml.m5.large, which you have already mentioned.
What are you hoping to achieve inside this notebook of yours? If it's to process some data, then I would suggest trying out a SageMaker Processing job.
A Processing job's intended purpose is to fetch raw data from S3 (or other sources), run the script that processes the data, and then send the results back to S3. You'll be billed only for the duration of the job in seconds, and there is an option to use ml.t3.medium as well.
Below is a code sample you can use as a reference should you choose to use a Processing job: https://github.com/aws/amazon-sagemaker-examples/blob/main/sagemaker_processing/scikit_learn_data_processing_and_model_evaluation/scikit_learn_data_processing_and_model_evaluation.ipynb
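For a quick idea of what that looks like, here is a minimal sketch of launching a Processing job on ml.t3.medium with the SageMaker Python SDK. The script name, bucket, and prefixes are placeholders (not taken from the linked notebook), so adjust them to your own setup:

```python
# Minimal sketch of a SageMaker Processing job on ml.t3.medium.
# "preprocessing.py" and the S3 paths below are placeholders.
import sagemaker
from sagemaker.sklearn.processing import SKLearnProcessor
from sagemaker.processing import ProcessingInput, ProcessingOutput

role = sagemaker.get_execution_role()      # or pass an explicit IAM role ARN

processor = SKLearnProcessor(
    framework_version="1.2-1",             # a supported scikit-learn version
    role=role,
    instance_type="ml.t3.medium",          # billed only while the job runs
    instance_count=1,
)

processor.run(
    code="preprocessing.py",               # your processing script
    inputs=[ProcessingInput(
        source="s3://your-bucket/raw/",                # raw data in S3
        destination="/opt/ml/processing/input",
    )],
    outputs=[ProcessingOutput(
        source="/opt/ml/processing/output",            # script writes here
        destination="s3://your-bucket/processed/",     # results back to S3
    )],
)
```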
SageMaker training jobs don't support t3.medium instances. Are you running notebooks or Python scripts? If it's simple enough, could you use Lambda or other serverless options?
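If the workload fits within Lambda's limits (15-minute timeout, limited memory), a scheduled function could be the cheapest option. A rough sketch is below; the bucket, keys, and the filtering step are placeholder assumptions, not anything from this thread:

```python
# Sketch of a Lambda handler for a small, scheduled data task (e.g. triggered
# by an EventBridge rule). Bucket and key names are placeholders.
import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # Read a small object, do lightweight processing, write the result back.
    obj = s3.get_object(Bucket="your-bucket", Key="raw/data.json")
    records = json.loads(obj["Body"].read())

    processed = [r for r in records if r.get("value", 0) > 0]  # example transform

    s3.put_object(
        Bucket="your-bucket",
        Key="processed/data.json",
        Body=json.dumps(processed).encode("utf-8"),
    )
    return {"count": len(processed)}
```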