Can Glue Spark timezone be changed?


We have upstream systems that all use the US Central time zone, but our PySpark/Spark SQL jobs in Glue run in UTC, and current_timestamp() returns UTC time. Can we direct Glue to use a different timezone? We tried adding a configuration to SparkConf: ("spark.sql.session.timeZone", "America/Chicago").

We also tried adding --java-options -Duser.timezone="America/Chicago" from the dashboard Run with Parameters feature.

Neither had the effect of updating Spark's timezone. Any help here? Roughly what we tried in the job script is shown in the sketch below.
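A minimal sketch of the SparkConf attempt described above (standard Glue boilerplate assumed; the config key and timezone value are the ones we used):

```python
from pyspark.conf import SparkConf
from pyspark.context import SparkContext
from awsglue.context import GlueContext

# Set the session timezone on SparkConf before creating the context.
# This is the attempt that did not change what current_timestamp() shows.
conf = SparkConf().set("spark.sql.session.timeZone", "America/Chicago")
sc = SparkContext(conf=conf)
glueContext = GlueContext(sc)
spark = glueContext.spark_session
```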

Asked 1 year ago · 967 views
1 Answer

A timestamp doesn't have a timezone; by definition it is based on UTC. The timezone you are configuring comes into play when you parse a date or format that timestamp into a string.

If you do a show() on a timestamp column, you should see it rendered in the configured timezone. If you don't, the setting may not be applied correctly; note that spark.sql.session.timeZone has to be set on the SparkSession, not on the context.
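A minimal sketch of setting the timezone on the SparkSession in a Glue PySpark job and checking how a timestamp is rendered (the column names here are illustrative):

```python
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from pyspark.sql import functions as F

sc = SparkContext.getOrCreate()
glueContext = GlueContext(sc)
spark = glueContext.spark_session

# Set the session timezone on the SparkSession itself, not on the SparkContext.
spark.conf.set("spark.sql.session.timeZone", "America/Chicago")

# current_timestamp() still represents an instant (UTC internally); the session
# timezone only controls how it is rendered when displayed or formatted.
df = spark.range(1).select(
    F.current_timestamp().alias("ts"),
    F.date_format(F.current_timestamp(), "yyyy-MM-dd HH:mm:ss zzz").alias("ts_central"),
)
df.show(truncate=False)  # should print the timestamp in America/Chicago
```

If show() still prints UTC, double-check that the config was set on the same SparkSession the DataFrame was created from.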

AWS EXPERT · Answered 1 year ago
