Optimizing AWS Glue Job for Faster S3 Writes of Large Datasets


I have an AWS Glue job that transfers data from a PostgreSQL database to Amazon S3. The job functioned efficiently until the size of the data increased. Now, when attempting to save approximately 2-3 GB of data to S3, it takes over an hour, which is significantly slower than desired.

The bottleneck appears to be in the following section of the code, specifically during the write operation to S3:

# Read a single partition of the source table through the Glue Data Catalog
dynamic_frame = glueContext.create_dynamic_frame.from_catalog(
    database="pg-to-s3",
    table_name=f"stats_prod_{table_name}",
    additional_options={"partitionPredicate": f"'{partition_name}'"},
)

I then repartition the frame and write it to S3 as follows:

# Repartition into 24 partitions (24 output files), then write
# snappy-compressed Parquet to S3
frame = frame.repartition(24)
glueContext.write_dynamic_frame.from_options(
    frame=frame,
    connection_type="s3",
    format="glueparquet",
    connection_options={"path": s3_path},
    format_options={"compression": "snappy"},
)

I've isolated the problem to this block through print-based debugging, where timestamps indicated a significant delay during the execution of these lines.

I'm looking for suggestions on how to optimize this part of the code to reduce the time taken to write large datasets (2-3 GB) to S3. Are there best practices or specific strategies I can employ to enhance the efficiency of this data transfer process in AWS Glue?

Any advice or insights would be greatly appreciated, especially from those who have tackled similar challenges with AWS Glue and large data transfers to S3.

itamar
Asked 1 month ago · 178 views
1 Answer

Don't be misled by which lines appear slow: Spark evaluates lazily, so the write call is what triggers the whole job, including the read and any transformations. That doesn't mean writing the files to S3 is the slow part.
In your case the bottleneck is most likely reading from PostgreSQL. See the documentation on parallel JDBC reads: https://docs.aws.amazon.com/glue/latest/dg/run-jdbc-parallel-read-job.html
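
If the read does turn out to be the bottleneck, the linked page describes splitting the JDBC read across multiple connections with the hashfield / hashpartitions (or hashexpression) options. A minimal sketch reusing the variables from your question; the split column "id" and the partition count of 16 are assumptions you would adapt to your table and worker capacity:

# Hypothetical sketch of a parallel JDBC read, based on the doc linked above.
# "hashfield" and "hashpartitions" are Glue's documented JDBC read options;
# the column "id" and the value "16" are assumed, not taken from the original job.
dynamic_frame = glueContext.create_dynamic_frame.from_catalog(
    database="pg-to-s3",
    table_name=f"stats_prod_{table_name}",
    additional_options={
        "partitionPredicate": f"'{partition_name}'",
        "hashfield": "id",        # column used to split the table across readers
        "hashpartitions": "16",   # number of parallel JDBC connections
    },
)

One way to confirm where the time actually goes is to force the read to materialize before the write, for example by timing a dynamic_frame.count(); with lazy evaluation, the write line otherwise absorbs the cost of the read as well.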

AWS EXPERT · Answered 1 month ago
EXPERT · Reviewed 1 month ago
  • But I don't use a complex query at all. I just take the oldest partition, write it to S3, and drop it. I'm pretty sure that shouldn't take so long.
