AWS Glue Python job: PySpark libraries

How can I include PySpark libraries when running an AWS Glue Python job?

Asked 1 year ago · Viewed 321 times

1 Answer
The question isn't entirely clear as worded.

Let me clarify based on what I understand. If your question is how to include PySpark libraries in an AWS Glue Python Shell job, this cannot be done: a Python Shell job runs on a single compute resource and does not provision the driver and executors that a Spark engine requires.

For an AWS Glue PySpark job, you import the libraries as usual. When you create a job from the console, it provides the default imports needed to get started.
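As a rough sketch, the console-generated boilerplate for a Glue PySpark job looks like the following. Note that the `awsglue` module is supplied by the Glue runtime itself, so this script only runs inside a Glue job, not on a local machine; the transform step in the middle is a placeholder for your own logic.

```python
# Typical default imports and setup for an AWS Glue PySpark job.
# awsglue is provided by the Glue environment and is not pip-installable.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import *
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Resolve the JOB_NAME argument that Glue passes to every job run.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])

sc = SparkContext()
glueContext = GlueContext(sc)
spark = glueContext.spark_session  # plain SparkSession for PySpark APIs
job = Job(glueContext)
job.init(args["JOB_NAME"], args)

# ... your PySpark / Glue transforms go here ...

job.commit()
```

From here you can use `spark` for ordinary PySpark DataFrame work and `glueContext` for Glue-specific constructs such as DynamicFrames.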

AWS
Answered 1 year ago
Expert reviewed 1 year ago
