Environment variables for PySpark executors in AWS EMR Serverless and the env-key limit
Hello, I have gone through the documentation and observed in practice the limit on env keys (spark.emr-serverless.driverEnv and spark.emr-serverless.executorEnv) in EMR Serverless, which is capped at 50. I have more than 100 environment variables set in code that need to be passed to the Spark submit job. With EMR we were using --properties-file. How can we do this with EMR Serverless? Any help?
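One commonly suggested workaround (not confirmed by AWS docs for this case, just a sketch) is to bypass the per-key config limit entirely: ship a single properties-style file to the job (for example via `--files` in `sparkSubmitParameters`) and parse it inside the driver before creating the SparkSession. The file name `app.env` and the parsing helper below are hypothetical illustrations:

```python
import os

def load_env_file(path):
    """Parse a simple KEY=VALUE properties file and export each
    entry into os.environ. Blank lines and '#' comments are skipped."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            # Split only on the first '=' so values may contain '='
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip()

# Hypothetical usage inside the PySpark driver: ship the file with
#   spark-submit ... --files s3://my-bucket/app.env
# (file names and bucket are placeholders), then load it before
# building the SparkSession:
# load_env_file("app.env")
```

Note this populates the driver's environment; executors would need the same call in a task or a broadcasted dict, since `os.environ` is per-process.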