
EXCEPTION: botocore.errorfactory.NoSuchBucket: An error occurred (NoSuchBucket) when calling the CreateMultipartUpload operation: The specified bucket does not exist


I am creating new stages in my pipeline; one of them is fe-beta. I had run bb bootstrap only once previously. While deploying my changes, all stacks deploy correctly, but I am getting an error on the Approval Workflow-beta stage:

22:41:53.334 Streaming data to: s3://codetest-us-west-2-298037688727-do-not-delete/deployment/fes-hydra-workflow-FEAmazon-us-west-2-test5079284.zip
22:41:55.694 EXCEPTION: botocore.errorfactory.NoSuchBucket: An error occurred (NoSuchBucket) when calling the CreateMultipartUpload operation: The specified bucket does not exist [bats_workflow.private.publisher]
Traceback (most recent call last):
  File "/apollo/env/BATSWorkflow2/lib/python3.10/site-packages/bats_publisher_lib/s3.py", line 40, in _publish_artifact
    artifact_destination = self.write_file_to_destination(
  File "/apollo/env/BATSWorkflow2/lib/python3.10/site-packages/bats_publisher_lib/base.py", line 264, in write_file_to_destination
    props = s3_uploader.upload_file(artifact, key)
  File "/apollo/env/BATSWorkflow2/lib/python3.10/site-packages/bats_workflow_common/local_metrics.py", line 85, in wrapper
    raise e
  File "/apollo/env/BATSWorkflow2/lib/python3.10/site-packages/bats_workflow_common/local_metrics.py", line 81, in wrapper
    result = func(*args, **kwargs)
  File "/apollo/env/BATSWorkflow2/lib/python3.10/site-packages/bats_publisher_lib/s3_uploader.py", line 115, in upload_file
    destination = self._stream_to_s3(artifact_stream, key, size)
  File "/apollo/env/BATSWorkflow2/lib/python3.10/site-packages/bats_workflow_common/local_metrics.py", line 85, in wrapper
    raise e
  File "/apollo/env/BATSWorkflow2/lib/python3.10/site-packages/bats_workflow_common/local_metrics.py", line 81, in wrapper
    result = func(*args, **kwargs)
  File "/apollo/env/BATSWorkflow2/lib/python3.10/site-packages/bats_publisher_lib/s3_uploader.py", line 180, in _stream_to_s3
    response = self._multipart_upload(
  File "/apollo/env/BATSWorkflow2/lib/python3.10/site-packages/bats_publisher_lib/s3_uploader.py", line 250, in _multipart_upload
    create_response = self._s3_client.create_multipart_upload(**create_multipart_upload_input)
  File "/apollo/env/BATSWorkflow2/lib/python3.10/site-packages/botocore/client.py", line 570, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/apollo/env/BATSWorkflow2/lib/python3.10/site-packages/botocore/context.py", line 123, in wrapper
    return func(*args, **kwargs)
  File "/apollo/env/BATSWorkflow2/lib/python3.10/site-packages/botocore/client.py", line 1031, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.errorfactory.NoSuchBucket: An error occurred (NoSuchBucket) when calling the CreateMultipartUpload operation: The specified bucket does not exist
22:41:56.641 After categorizing, fault: CustomerFault; error PublishBundleFailed; reason: WriteToS3Error
1 Answer

The error indicates that the S3 bucket your pipeline uses for artifact storage does not exist or is not accessible when the Approval Workflow-beta stage tries to upload artifacts. Here are some steps to troubleshoot and resolve this issue:

  1. Verify the S3 bucket existence: Check if the bucket "codetest-us-west-2-298037688727-do-not-delete" exists in your AWS account. If it doesn't exist, you'll need to create it or update your pipeline configuration to use an existing bucket.

  2. Check bucket permissions: Ensure that the IAM role used by your pipeline has the necessary permissions to write to the specified S3 bucket. At a minimum the role needs s3:PutObject, which is also what authorizes the multipart upload calls (including CreateMultipartUpload); adding s3:AbortMultipartUpload allows failed uploads to be cleaned up.

  3. Review pipeline configuration: Double-check your pipeline configuration, especially for the new fe-beta stage, to make sure it's using the correct S3 bucket for artifact storage.

  4. Validate cross-account access: If you're working across multiple AWS accounts, ensure that the necessary cross-account permissions are in place for accessing the S3 bucket.

  5. Check for typos: Verify that there are no typos in the bucket name within your pipeline configuration.

  6. Rerun bootstrap: Since you mentioned running "bb bootstrap" only once previously, consider running it again to ensure all necessary resources, including the S3 bucket, are properly set up for your updated pipeline configuration.

  7. Review CloudWatch Logs: Check the CloudWatch logs for your pipeline to see if there are any additional error messages or clues about why the bucket access is failing.

  8. Verify region consistency: Ensure that the region specified in your pipeline configuration matches the region where the S3 bucket is located.
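Steps 1 and 2 can be checked programmatically before touching the pipeline. The sketch below is a minimal illustration, not part of the pipeline tooling: the helper name is made up, and it assumes boto3 is available and that HeadBucket is a suitable probe (S3 returns 404 when the bucket is missing, which matches the NoSuchBucket symptom, and 403 when the bucket exists but the caller lacks access). The bucket name and region are taken from the log output above.

```python
def classify_head_bucket_error(error_response: dict) -> str:
    """Map a failed HeadBucket error response to a diagnosis.

    S3 answers HeadBucket with 404/NoSuchBucket when the bucket does
    not exist, and 403/AccessDenied when it exists but the calling
    role is not allowed to touch it.
    """
    code = error_response.get("Error", {}).get("Code", "")
    if code in ("404", "NoSuchBucket"):
        return "missing"
    if code in ("403", "AccessDenied"):
        return "forbidden"
    return "unknown"


if __name__ == "__main__":
    import boto3
    from botocore.exceptions import ClientError

    # Bucket name and region copied from the log output in the question.
    bucket = "codetest-us-west-2-298037688727-do-not-delete"
    s3 = boto3.client("s3", region_name="us-west-2")
    try:
        s3.head_bucket(Bucket=bucket)  # cheapest existence + access probe
        print("ok: bucket exists and is accessible")
    except ClientError as err:
        print(classify_head_bucket_error(err.response))
```

If this prints "missing", rerunning the bootstrap (step 6) or recreating the bucket is the likely fix; "forbidden" points at the IAM role or cross-account setup instead (steps 2 and 4).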

If after checking these points the issue persists, you may need to recreate the S3 bucket or update your pipeline configuration to use a different, existing bucket for artifact storage. Remember to update any relevant IAM policies to reflect any changes in bucket names or locations.
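As a reference when updating those IAM policies, a minimal statement covering the upload path might look roughly like the sketch below (the Sid values are arbitrary, and the bucket ARN is taken from the log). Note that CreateMultipartUpload is authorized by s3:PutObject rather than by an action of its own, and that s3:ListBucket on the bucket ARN is what lets S3 return a 404 instead of a blanket 403 when a key is missing:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowArtifactUploads",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:AbortMultipartUpload"
      ],
      "Resource": "arn:aws:s3:::codetest-us-west-2-298037688727-do-not-delete/*"
    },
    {
      "Sid": "AllowBucketProbes",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": "arn:aws:s3:::codetest-us-west-2-298037688727-do-not-delete"
    }
  ]
}
```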
Sources
Glue Error: error occurred while calling o228.pyWriteDynamicFrame. com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.services.s3.model.AmazonS3Exception: The specified bucket does not exist | AWS re:Post
Codepipline s3 source action does not work | AWS re:Post

answered 10 months ago
