Not able to read from the raw bucket when I try to access files within subfolders

0

Here are my IAM role permissions for reading the buckets:

    "Sid": "AllowPermissions",
    "Effect": "Allow",
    "Action": [
        "s3:GetObject",
        "s3:ListBucket",
        "s3:ListObjects",
        "s3:ListObjectsV2"
    ],
    "Resource": [
        { "Fn::Sub": "${bucket1}/" },
        { "Fn::Sub": "${bucket2}/" },
        { "Fn::Sub": "${bucket3}/*" }
    ]

I have a method that lists all the files from the raw bucket, which is called while reading the data with the Spark reader:

    def listAllFilesFromSubFolders(spark: SparkSession, s3Path: String): Array[String] = {
      val hadoopConf = spark.sparkContext.hadoopConfiguration
      val fileSystem = FileSystem.get(new java.net.URI(s3Path), hadoopConf)

      // Recursively collect every file path under the given path
      def recursive(path: Path): Array[Path] = {
        val status = fileSystem.listStatus(path)
        val files = status.filter(!_.isDirectory).map(_.getPath)
        val directoryFiles = status.filter(_.isDirectory).map(_.getPath)
        directoryFiles.flatMap(recursive) ++ files
      }

      val allFiles = recursive(new Path(s3Path))
      allFiles.filter(_.getName.startsWith("global")).map(_.toString)
    }

Calling this method fails with the following exception:

    java.io.IOException: com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.services.s3.model.AmazonS3Exception: Copy Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: ; S3 Extended Request ID: /EH99x0PwpoQwyACxw8wv3dFVdPEz2GgqmrMOMf2yvhoI6QL4hzYSw=; Proxy: null), S3 Extended Request ID: j/EH99x0PwpoQCwyACxw8w3dFVdPEz2GgqmrMOMf2yvhOI6QL4hzY5w=
        at com.amazon.ws.emr.hadoop.fs.s3n.Jets3tNativeFileSystemStore.list(Jets3tNativeFileSystemStore.java:303)
        at com.amazon.ws.emr.hadoop.fs.s3n.S3NativeFileSystem.listStatus(S3NativeFileSystem.java:665)
        at com.amazon.ws.emr.hadoop.fs.s3n.S3NativeFileSystem.listStatus(S3NativeFileSystem.java:636)
        at com.amazon.ws.emr.hadoop.fs.EmrFileSystem.listStatus(EmrFileSystem.java:473)
        at com.ics.utils.FilePathGenerator$.recursive$1(FilePathGenerator.scala:34)
        at com.ics.utils.FilePathGenerator$.listAllFilesFromSubFolders(FilePathGenerator.scala:40)
        at $.main(.scala:111)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at com.amazonaws.services.glue.SparkProcessLauncherPlugin.invoke(ProcessLauncher.scala:48)

  • Hi, please update your question and use the code format </> to make your IAM policy readable by the re:Post community.

Madhu
asked 8 months ago · 242 views
1 Answer
1

Hello,

From the error, it looks like a copy operation is being performed, and that is what is throwing the access denied error. Can you check whether your IAM policy grants the permissions below on the required bucket, because the copy operation uses these permissions:

s3:GetObject, s3:PutObject, s3:GetObjectTagging, s3:PutObjectTagging
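For illustration, a minimal policy statement granting those actions could look like the sketch below; the bucket name my-raw-bucket is a placeholder for your actual raw bucket:

```json
{
    "Sid": "AllowCopyPermissions",
    "Effect": "Allow",
    "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:GetObjectTagging",
        "s3:PutObjectTagging"
    ],
    "Resource": "arn:aws:s3:::my-raw-bucket/*"
}
```

Note that these are object-level actions, so they apply to the /* object ARN, while s3:ListBucket (which the listStatus call relies on) must be granted on the bucket ARN itself, without /*.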

Also, if the object is encrypted with an AWS KMS key, confirm that your IAM identity/role has the correct permissions on the key. If your IAM identity/role and the AWS KMS key belong to the same account, confirm that your key policy grants the required AWS KMS permissions. Moreover, there should not be an explicit deny in the bucket policy, so please check for that as well.
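As a rough sketch of the KMS side, a statement along the following lines could be attached to the role; the key ARN shown is a placeholder, and the exact actions needed depend on whether you only read or also write encrypted objects:

```json
{
    "Effect": "Allow",
    "Action": [
        "kms:Decrypt",
        "kms:GenerateDataKey"
    ],
    "Resource": "arn:aws:kms:us-east-1:111122223333:key/example-key-id"
}
```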

For the most common causes of Access Denied (403 Forbidden) errors in Amazon S3, please visit this link:
https://docs.aws.amazon.com/AmazonS3/latest/userguide/troubleshoot-403-errors.html

Hope this helps.

AWS
answered 8 months ago
AWS
EXPERT
kentrad
reviewed 8 months ago
