Can Glue Spark timezone be changed?


We have upstream systems that all use the US Central time zone, but our PySpark/Spark SQL jobs in Glue run in UTC, and current_timestamp() returns UTC time. Can we direct Glue to use a different time zone? We tried adding a configuration to SparkConf: ("spark.sql.session.timeZone", "America/Chicago")

We also tried adding --java-options -Duser.timezone="America/Chicago" from the console's "Run with parameters" feature.
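Roughly the setup we used for the first attempt (a simplified sketch, not our exact job script; the GlueContext boilerplate is just the usual pattern):

```python
# Simplified version of our Glue job setup (illustrative only)
from pyspark.conf import SparkConf
from pyspark.context import SparkContext
from awsglue.context import GlueContext

conf = SparkConf().set("spark.sql.session.timeZone", "America/Chicago")
sc = SparkContext(conf=conf)
glue_context = GlueContext(sc)
spark = glue_context.spark_session

# Still shows the UTC value for us
spark.sql("SELECT current_timestamp()").show(truncate=False)
```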

Neither had the effect of updating Spark's time zone. Any help here?

Asked 1 year ago · 967 views

1 Answer

A Spark timestamp doesn't carry a time zone; internally it is stored as an instant based on UTC. The time zone you are configuring only comes into play when you parse a date string or format the timestamp into a string (including display).

If you do a show() on a timestamp column, you should see it rendered in the configured time zone; if you don't, the setting is probably not being applied. Note that spark.sql.session.timeZone has to be set on the SparkSession (SQL) configuration, not on the SparkContext.
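For example, a minimal sketch of setting it on the session itself (in a Glue job the session would come from glue_context.spark_session; the zone and column names here are just placeholders):

```python
# Minimal sketch: set the session time zone on the SparkSession,
# then display/formatting of timestamps uses that zone.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # in Glue: glue_context.spark_session
spark.conf.set("spark.sql.session.timeZone", "America/Chicago")

df = spark.sql("SELECT current_timestamp() AS ts")
df.show(truncate=False)  # rendered in America/Chicago

# Formatting to a string also uses the session time zone
df.select(F.date_format("ts", "yyyy-MM-dd HH:mm:ss z").alias("ts_str")).show(truncate=False)
```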

AWS
EXPERT
Answered 1 year ago
