Hi,
There are various possible solutions, depending on what you are trying to achieve. In general, I believe it is a good idea to build a generic pipeline and use parameters for different jobs. The image below shows a typical ML pattern with stages.
You can use condition steps to orchestrate SageMaker jobs; more information with code examples is below:
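A minimal sketch of the two ideas above, assuming the SageMaker Python SDK: a runtime parameter so one generic pipeline can serve different jobs, and a condition step that gates registration on an evaluation metric. The step names, the `metrics.accuracy.value` report layout, and the 0.8 threshold are illustrative assumptions, not part of the original answer.

```python
# Sketch (not a complete pipeline): a runtime parameter plus a condition step
# that gates model registration on an evaluation metric.

def passes_quality_gate(accuracy, threshold=0.8):
    """Pure mirror of the condition the ConditionStep below encodes."""
    return accuracy >= threshold

def build_condition_step(evaluation_report, register_step, fail_step):
    # SageMaker SDK imported lazily so the helper above works without it.
    from sagemaker.workflow.conditions import ConditionGreaterThanOrEqualTo
    from sagemaker.workflow.condition_step import ConditionStep
    from sagemaker.workflow.functions import JsonGet
    from sagemaker.workflow.parameters import ParameterFloat

    # Runtime parameter: the same generic pipeline can be started with a
    # different threshold per job/experiment.
    accuracy_threshold = ParameterFloat(name="AccuracyThreshold", default_value=0.8)

    accuracy = JsonGet(
        step_name="EvalModel",               # assumed evaluation step name
        property_file=evaluation_report,     # PropertyFile attached to that step
        json_path="metrics.accuracy.value",  # assumed evaluation report layout
    )

    return ConditionStep(
        name="CheckAccuracy",
        conditions=[ConditionGreaterThanOrEqualTo(left=accuracy, right=accuracy_threshold)],
        if_steps=[register_step],  # taken when accuracy >= threshold
        else_steps=[fail_step],    # otherwise take the failure branch
    )
```

The condition step, parameter, and the surrounding register/fail steps would all be passed into the `Pipeline(steps=[...], parameters=[...])` definition.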
Hope it helps,
Hi,
To schedule the batch transform job with the latest approved model, using a Lambda with an EventBridge rule is (as of today) the best way to achieve your use case - please find the custom project template for batch inference you could leverage here:
https://github.com/aws-samples/sagemaker-custom-project-templates/tree/main/batch-inference
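In the spirit of that template (the repo above is the authoritative version), a hedged sketch of the Lambda the EventBridge rule would invoke: it looks up the latest Approved package in a Model Registry group, then launches a batch transform with it. The group name, role ARN, job names, and S3 URIs are all placeholders.

```python
# Sketch of the scheduled Lambda: find the newest Approved model package,
# create a model from it, and start a batch transform job.

def latest_approved_arn(packages):
    """Pick the newest Approved entry from a list_model_packages summary list."""
    approved = [p for p in packages if p.get("ModelApprovalStatus") == "Approved"]
    if not approved:
        return None
    return max(approved, key=lambda p: p["CreationTime"])["ModelPackageArn"]

def handler(event, context):
    import boto3  # imported here so latest_approved_arn stays testable offline
    sm = boto3.client("sagemaker")

    resp = sm.list_model_packages(
        ModelPackageGroupName="my-model-group",  # placeholder group name
        ModelApprovalStatus="Approved",
    )
    package_arn = latest_approved_arn(resp["ModelPackageSummaryList"])
    if package_arn is None:
        raise RuntimeError("No approved model package found")

    sm.create_model(
        ModelName="batch-model",  # placeholder; must be unique per run in practice
        ExecutionRoleArn="arn:aws:iam::123456789012:role/sagemaker-role",  # placeholder
        Containers=[{"ModelPackageName": package_arn}],
    )
    sm.create_transform_job(
        TransformJobName="batch-job",  # placeholder; must be unique
        ModelName="batch-model",
        TransformInput={"DataSource": {"S3DataSource": {
            "S3DataType": "S3Prefix", "S3Uri": "s3://my-bucket/batch-input/"}}},
        TransformOutput={"S3OutputPath": "s3://my-bucket/batch-output/"},
        TransformResources={"InstanceType": "ml.m5.xlarge", "InstanceCount": 1},
    )
```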
In addition, you could add a Transform step at the end of your SageMaker Pipeline to automatically run batch transform for inference once training is done. Reference is here:
https://docs.aws.amazon.com/sagemaker/latest/dg/build-and-manage-steps.html#step-type-transform
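A minimal sketch of that Transform step, assuming the SageMaker Python SDK; the instance type, step name, and S3 URIs are illustrative, and the model name would typically come from an upstream CreateModel step's properties.

```python
# Sketch: build a Transform step to append to a pipeline so batch inference
# runs right after training.

def is_s3_uri(uri):
    """Small sanity check on the transform input/output locations."""
    return isinstance(uri, str) and uri.startswith("s3://")

def build_transform_step(model_name, input_uri, output_uri):
    # SageMaker SDK imported lazily so is_s3_uri stays usable without it.
    from sagemaker.transformer import Transformer
    from sagemaker.inputs import TransformInput
    from sagemaker.workflow.steps import TransformStep

    assert is_s3_uri(input_uri) and is_s3_uri(output_uri)

    transformer = Transformer(
        model_name=model_name,  # e.g. step_create_model.properties.ModelName
        instance_type="ml.m5.xlarge",
        instance_count=1,
        output_path=output_uri,
    )
    return TransformStep(
        name="BatchTransform",
        transformer=transformer,
        inputs=TransformInput(data=input_uri),
    )
```

The returned step is then included in the pipeline's `steps=[...]` list alongside the processing/training steps, and its dependency on the CreateModel step is inferred from the `model_name` property reference.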
Thanks,
@Jady - thanks. I went through the samples you provided and they look straightforward. It looks like adding a transformer and a transform step to my current notebook should work. Also, is there a way I can extend this? If I copy and paste this part of the code into each of my notebooks it will work, but I also want to be able to reuse this code across different projects/pipelines so that I don't have to copy it into every notebook. Do I set up a separate pipeline altogether just for this, and use parameters that I can set at runtime for my different experiments? But then how do I connect my training pipeline with my batch testing pipeline in this case? Or is there a different or better way to do this?
Hi,
I am glad you found the answer useful.
I have some suggestions which might help, though I have not connected multiple SageMaker pipelines together myself.
Please check Amazon EventBridge:
- An EventBridge rule can be triggered by SageMaker pipeline state changes, reference is here: https://docs.aws.amazon.com/sagemaker/latest/dg/automating-sagemaker-with-eventbridge.html#eventbridge-pipeline
- An EventBridge rule can trigger your subsequent SageMaker pipeline, reference: https://docs.aws.amazon.com/sagemaker/latest/dg/pipeline-eventbridge.html#pipeline-eventbridge-schedule
Alternatively, you may use a Lambda as the last step of the pipeline to trigger the next pipeline, but you will have the overhead of managing the Lambda.
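A hedged sketch of that chaining idea: EventBridge invokes a Lambda on SageMaker pipeline state changes, and on success the handler starts the downstream pipeline. The `currentPipelineExecutionStatus` event field follows the documented pipeline state-change event shape, but the downstream pipeline name and any forwarded parameters are assumptions to adapt.

```python
# Sketch of the chaining Lambda: invoked by an EventBridge rule on SageMaker
# pipeline state changes; on success it starts the downstream pipeline.

def should_trigger(event):
    """True when the upstream pipeline execution finished successfully."""
    detail = event.get("detail", {})
    return detail.get("currentPipelineExecutionStatus") == "Succeeded"

def handler(event, context):
    if not should_trigger(event):
        return {"started": False}

    import boto3  # imported here so should_trigger stays testable offline
    sm = boto3.client("sagemaker")
    resp = sm.start_pipeline_execution(
        PipelineName="batch-testing-pipeline",  # placeholder downstream pipeline
        # Values from the upstream run (e.g. a model package ARN) can be
        # forwarded via PipelineParameters=[{"Name": ..., "Value": ...}].
    )
    return {"started": True, "executionArn": resp["PipelineExecutionArn"]}
```

This is also where runtime parameters pay off: the downstream pipeline can accept the training pipeline's artifacts (model package ARN, data locations) as parameters instead of hard-coding them.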
Thanks,
thanks again.
@Jady - thank you again. I went through both of the docs and they helped. Based on your second comment/docs, all of the steps - processing, training, model creation, register, transform, and so on - are defined and executed as part of a single pipeline. For my use case, I would like to have separate pipelines for some of these steps: say, one pipeline for processing and training, another for model creation and registering, another for testing, and so forth. Is that possible, and if yes, how would one tie these pipelines/artifacts from one pipeline to another? Apologies for adding another question, but I promise this is the last thing I'm wondering.