Systems Manager Automation document restriction on using S3 PutObject


I'm trying to build a reporting feature that uses a couple of AWS API calls made via the aws:executeAwsApi automation action. When I get to the PutObject operation, I receive an error that doesn't seem to be described anywhere:

Step fails when it is Execute/Cancelling action. PutObject API is not supported. Please refer to Automation Service Troubleshooting Guide for more diagnosis details.

The FailureType is Verification, the FailureStage is Invocation, and the VerificationErrorMessage is "PutObject API is not supported". I can't find any reference to this in the Systems Manager Automation troubleshooting guide or on the aws:executeAwsApi page.

I know how to achieve the goal using boto3 and Lambda, but in this case I'm trying to reach it via an SSM Automation document. If there is a limitation on using certain API actions, why is this not in the docs (or am I just unable to find it)?

2 Answers

The restriction on using the S3 PutObject operation in Systems Manager Automation documents may not be explicitly documented. However, there are limitations on which AWS API operations can be used in Automation documents, and PutObject may be one of them.

One way to work around this limitation would be to use a Lambda function instead of an Automation document, as you mentioned. Another option would be to run the AWS CLI command "aws s3 cp" on a managed instance instead of calling PutObject directly. You can use the "aws:runCommand" action with the AWS-RunShellScript document to run the CLI command (aws:executeAwsApi cannot run shell commands).

Here's an example of how to use "aws s3 cp" in an Automation document:

  {
    "description": "Copy a file to S3",
    "schemaVersion": "0.3",
    "assumeRole": "{{ assumeRole }}",
    "mainSteps": [
      {
        "name": "CopyFileToS3",
        "action": "aws:runCommand",
        "inputs": {
          "DocumentName": "AWS-RunShellScript",
          "InstanceIds": ["{{ instanceId }}"],
          "Parameters": {
            "commands": [
              "aws s3 cp /path/to/local/file s3://my-bucket/path/to/destination/file"
            ]
          }
        }
      }
    ]
  }
This example assumes that the instance where the command runs already has AWS CLI credentials configured (for example, through an instance profile) that allow writing to the bucket. If you need to specify credentials, you can add the "--profile" option to the command in the "commands" parameter.

I hope this helps! Let me know if you have any further questions.

answered a year ago

This is a valid solution if you are running the automation on an instance. However, I'm trying to run it with an AWS account as the target (no EC2 involved), with a role that I supply externally. I made it work by using boto3 as a wrapper in the last action, which publishes the report to S3 (again, I didn't want to use Lambda to its full extent in this solution). The final step looks like this:

def script_handler(events, context):
    import boto3
    import json

    data = events['json']
    s3 = boto3.client('s3')
    file = json.dumps(data)
    # Upload the serialized report; bucket and key are placeholders
    s3.put_object(Bucket='my-report-bucket', Key='reports/report.json', Body=file)

This is an aws:executeScript step where I provide the JSON input from the output of the previous step, similar to this:

InputPayload:
  json: '{{ jstepName.stepOutput }}'
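Putting the pieces together, a minimal sketch of such a final step might look like the following (the step name, referenced output, runtime version, and bucket/key are assumptions for illustration, not taken from my actual document):

```yaml
# Sketch of an aws:executeScript step that publishes a JSON report to S3.
# PublishReportToS3, GenerateReport.stepOutput, and the bucket/key are placeholders.
- name: PublishReportToS3
  action: aws:executeScript
  inputs:
    Runtime: python3.8
    Handler: script_handler
    InputPayload:
      json: '{{ GenerateReport.stepOutput }}'
    Script: |-
      def script_handler(events, context):
          import boto3
          import json
          data = events['json']
          s3 = boto3.client('s3')
          # put_object takes the serialized body directly, so no
          # streaming input is needed from the automation side
          s3.put_object(Bucket='my-report-bucket',
                        Key='reports/report.json',
                        Body=json.dumps(data))
```

The role supplied in the document's assumeRole is what needs s3:PutObject permission on the target bucket.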

As a final word, I also managed to decode the official docs. There is a note about streaming operations (like PutObject): "Most API operations are supported, although not all API operations have been tested. Streaming API operations, such as the GetObject operation, aren't supported. If you're not sure if an API operation you want to use is a streaming operation, review the Boto3 documentation for the service to determine if an API requires streaming inputs."

answered a year ago
