Verify AWS CLI configuration and S3 access
Ensure that the AWS CLI is installed on your system and that access to your AWS account is configured with access keys created in the IAM console.
Once you have access through the AWS CLI, verify that you have permission to read the objects inside the S3 bucket using the following command:
Replace S3_BUCKET_NAME with the name of your bucket.
aws s3 ls s3://S3_BUCKET_NAME/
Grant S3 permissions if denied
If you do not have access to the S3 bucket, use the AWS Management Console to ensure that your IAM user/role has an attached IAM policy granting permission to read and write objects and to list the bucket.
Example IAM policy:
Replace S3_BUCKET_NAME with the name of your bucket.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::S3_BUCKET_NAME",
                "arn:aws:s3:::S3_BUCKET_NAME/*"
            ]
        }
    ]
}
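If you prefer to grant the permissions programmatically rather than through the console, a minimal sketch using boto3's IAM API could look like the following. The user name and the policy name `S3BucketAccess` are placeholders, and the `put_user_policy` call requires credentials with IAM write access:

```python
import json

def make_s3_policy(bucket_name):
    """Build the read/write/list policy document shown above."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket_name}",
                    f"arn:aws:s3:::{bucket_name}/*",
                ],
            }
        ],
    }

def attach_policy(user_name, bucket_name):
    """Attach the policy inline to an IAM user (requires AWS credentials)."""
    import boto3  # imported here so the helper above works without boto3 installed
    iam = boto3.client('iam')
    iam.put_user_policy(
        UserName=user_name,
        PolicyName='S3BucketAccess',
        PolicyDocument=json.dumps(make_s3_policy(bucket_name)),
    )
```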
List all files with Boto3
Once you have verified that the AWS CLI has access, you can list all of the files in your S3 bucket using the following Python script:
Replace S3_BUCKET_NAME with the name of your bucket.
import boto3

# Create a boto3 S3 client instance
s3 = boto3.client('s3')

# Get the list of objects in the S3 bucket (up to 1,000 keys per call)
response = s3.list_objects_v2(Bucket='S3_BUCKET_NAME')

# Iterate over the objects; 'Contents' is absent when the bucket is empty
for obj in response.get('Contents', []):
    # Skip zero-byte "directory" placeholder keys
    if not obj['Key'].endswith('/'):
        # Print the file name
        print(obj['Key'])
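Note that `list_objects_v2` returns at most 1,000 keys per call, so for larger buckets the script above will miss files. A sketch that handles this with boto3's paginator (the bucket name is still a placeholder, and the call requires AWS credentials) could look like:

```python
def is_file(key):
    # S3 has no real directories; keys ending in '/' are zero-byte
    # "folder" placeholders created by the console.
    return not key.endswith('/')

def list_files(bucket_name):
    """Yield every file key in the bucket, handling pagination."""
    import boto3  # imported here so is_file() works without boto3 installed
    s3 = boto3.client('s3')
    paginator = s3.get_paginator('list_objects_v2')
    for page in paginator.paginate(Bucket=bucket_name):
        for obj in page.get('Contents', []):
            if is_file(obj['Key']):
                yield obj['Key']
```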
Download files with Boto3
To download a specific file from the S3 bucket using Boto3, use the following Python script:
Replace S3_BUCKET_NAME with the name of the S3 bucket.
Replace S3_BUCKET_OBJECT_NAME with the name and path of the object in the S3 bucket.
Replace LOCAL_FILE_NAME with the name you would like to assign to the downloaded file. The file will be downloaded to your current directory.
import boto3
s3 = boto3.client('s3')
s3.download_file('S3_BUCKET_NAME', 'S3_BUCKET_OBJECT_NAME', 'LOCAL_FILE_NAME')
Example:
import boto3
s3 = boto3.client('s3')
s3.download_file('s3-bucket', 'dir1/dir2/dir3/dir4/file.txt', 'localFile.txt')
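To download everything under a "folder" (an S3 key prefix) rather than a single object, you can combine listing and downloading. The following is a minimal sketch, assuming a prefix such as `dir1/dir2/` and a local destination directory; both are placeholders, and the download itself requires AWS credentials:

```python
import os

def local_path_for(key, prefix, dest_dir):
    """Map an S3 key under `prefix` to a path inside `dest_dir`."""
    relative = key[len(prefix):].lstrip('/')
    return os.path.join(dest_dir, *relative.split('/'))

def download_prefix(bucket_name, prefix, dest_dir):
    """Download every object under `prefix`, recreating the folder layout."""
    import boto3  # imported here so local_path_for() works without boto3 installed
    s3 = boto3.client('s3')
    paginator = s3.get_paginator('list_objects_v2')
    for page in paginator.paginate(Bucket=bucket_name, Prefix=prefix):
        for obj in page.get('Contents', []):
            key = obj['Key']
            if key.endswith('/'):
                continue  # skip zero-byte folder placeholders
            target = local_path_for(key, prefix, dest_dir)
            parent = os.path.dirname(target)
            if parent:
                os.makedirs(parent, exist_ok=True)
            s3.download_file(bucket_name, key, target)
```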
Upload files with Boto3
To upload a file to the S3 bucket using Boto3, use the following Python script:
Replace LOCAL_FILE_NAME with the name of the file on your local system that you want to upload.
Replace S3_BUCKET_NAME with the name of the S3 bucket you would like to upload to.
Replace S3_BUCKET_OBJECT_NAME with the name and path that you would like to store the file as.
import boto3
s3 = boto3.client('s3')
s3.upload_file('LOCAL_FILE_NAME', 'S3_BUCKET_NAME', 'S3_BUCKET_OBJECT_NAME')
Example:
import boto3
s3 = boto3.client('s3')
s3.upload_file('localFile.txt', 's3-bucket', 'dir1/dir2/dir3/dir4/newFile.txt')
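If an upload fails, catching botocore's `ClientError` makes the cause easier to diagnose. The following is a hedged sketch; the helper function names are mine, and the hints it prints map to the permission fixes described above:

```python
def explain_s3_error(error_code):
    """Map common S3 error codes to a likely fix."""
    hints = {
        'AccessDenied': 'check that the IAM policy grants s3:PutObject',
        'NoSuchBucket': 'check the bucket name and region',
    }
    return hints.get(error_code, 'see the full error message')

def upload_with_check(local_name, bucket_name, object_name):
    """Upload a file, returning False (with a hint) on a client error."""
    import boto3  # imported here so explain_s3_error() works without boto3 installed
    from botocore.exceptions import ClientError
    s3 = boto3.client('s3')
    try:
        s3.upload_file(local_name, bucket_name, object_name)
        return True
    except ClientError as err:
        code = err.response['Error']['Code']
        print(f"Upload failed ({code}): {explain_s3_error(code)}")
        return False
```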
References
Create an IAM user access key with the AWS Management Console:
https://docs.aws.amazon.com/powershell/latest/userguide/pstools-appendix-sign-up.html
Configure AWS CLI with access key:
Grant IAM user/role permissions to the S3 bucket:
List objects in an S3 bucket:
Download files using Boto3:
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-example-download-file.html
Upload files using Boto3:
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-uploading-files.html
Are there any errors when running Python?
Also, if possible, would you be willing to share your Python code?
No, and I don't know how to write code to traverse the S3 bucket folder structure using Python. If you have any articles related to this, could you share them with me?
Could you provide more details? Do you get any errors when traversing through folders?