How to set environment variables in AWS EMR from Secrets Manager so they can be used by PySpark scripts


I am using emr-6.12.0 and trying to set environment variables, which are stored in AWS Secrets Manager, in a bootstrap.sh file.

SECRET_NAME="/myapp/dev/secrets"
SECRETS_JSON=$(aws secretsmanager get-secret-value --secret-id "$SECRET_NAME" --query SecretString --output text)

# Parse the secret JSON and export each key/value pair as an environment variable
for key in $(echo "$SECRETS_JSON" | jq -r 'keys[]'); do
  # --arg avoids breaking on keys containing dots or other special characters
  value=$(echo "$SECRETS_JSON" | jq -r --arg k "$key" '.[$k] // empty')
  echo "$value"   # note: this writes secret values to the bootstrap log
  if [ -n "$value" ]; then
    export "$key"="$value"
  fi
done

I am able to see these values in the bootstrap log.
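For what it's worth, the parsing step in the bash loop above can be sketched in Python, where JSON keys with special characters are handled automatically (a minimal local sketch; the secret payload below is a made-up example, not a real secret):

```python
import json
import os


def export_secrets(secret_string: str) -> dict:
    """Parse a Secrets Manager SecretString (a JSON object) and export
    each non-empty key/value pair into the current process environment."""
    secrets = json.loads(secret_string)
    exported = {}
    for key, value in secrets.items():
        if value:  # skip empty values, mirroring the bash `-n` check
            os.environ[key] = str(value)
            exported[key] = str(value)
    return exported


# Hypothetical secret payload for illustration only
example = '{"POSTGRES_URL": "jdbc:postgresql://db:5432/app", "EMPTY_KEY": ""}'
exported = export_secrets(example)
```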

But when I try to access these variables from my PySpark script, they are not set.

os.environ.get("POSTGRES_URL")  # returns None

for key, value in os.environ.items():
    self.logger.info(f"{key}: {value}")  # my variables do not appear here

As I am new to EMR and Spark, please help me understand how to set environment variables from Secrets Manager in EMR.
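For context: exports made in a bootstrap action exist only in that bootstrap shell; the YARN containers that later run the Spark driver and executors do not inherit them, which is consistent with the behavior described above. One commonly used alternative (a sketch under the assumption of a YARN-based EMR cluster; the variable name and value are placeholders) is to declare the variables through EMR's `spark-env` configuration classification when the cluster is created:

```json
[
  {
    "Classification": "spark-env",
    "Configurations": [
      {
        "Classification": "export",
        "Properties": {
          "POSTGRES_URL": "jdbc:postgresql://db-host:5432/app"
        }
      }
    ]
  }
]
```

Per job, Spark's own properties can serve the same purpose, e.g. `spark-submit --conf spark.yarn.appMasterEnv.POSTGRES_URL=... --conf spark.executorEnv.POSTGRES_URL=...`, which make the variable visible to the driver (in cluster deploy mode) and the executors respectively. Note that baking secret values into cluster configuration exposes them in the console/API; fetching the secret at runtime with boto3 inside the script avoids that.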

vivek
asked 6 months ago · 336 views
1 Answer
AWS
answered 6 months ago
