Environment variables for PySpark executors in AWS EMR Serverless and env key limitations with EMR Serverless


Hello, I have gone through the documentation and observed in practice that the env keys spark.emr-serverless.driverEnv and spark.emr-serverless.executorEnv in EMR Serverless are limited to a maximum of 50. I have more than 100 environment variables set in code that need to be passed to the Spark submit job. With EMR we were using --properties-file; how can we do the same with EMR Serverless? Any help?
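For context, one possible workaround is to stage the variables as a single file in S3 and load them at the start of the job, so only the file's location has to be passed through spark-submit. The sketch below is a minimal illustration of that idea, not an EMR Serverless feature: the bucket name, key, and helper function are hypothetical placeholders, and variables set this way only affect the driver process (executor-side code would need to read the same file or receive the values another way, e.g. via a broadcast).

```python
import json
import os

import boto3


def load_env_from_s3(bucket: str, key: str) -> None:
    """Read a JSON file of NAME: value pairs from S3 and export them
    as environment variables for the current (driver) process.

    Hypothetical helper -- the bucket and key are placeholders,
    not anything defined by EMR Serverless itself.
    """
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    for name, value in json.loads(body).items():
        os.environ[name] = str(value)


if __name__ == "__main__":
    # The job then needs only one real env var or spark-submit argument:
    # the location of the staged config file.
    load_env_from_s3("my-config-bucket", "jobs/etl/env.json")
    # ... create the SparkSession and run the rest of the job ...
```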

Ashwath
asked 21 days ago · 63 views
No Answers
