How to set environment variables in AWS EMR using SSM to be used by PySpark scripts


I am using emr-6.12.0 and trying to set environment variables, which are stored in Secrets Manager, in my bootstrap.sh file.

SECRET_NAME="/myapp/dev/secrets"
SECRETS_JSON=$(aws secretsmanager get-secret-value --secret-id "$SECRET_NAME" --query SecretString --output text)

# Parse the secrets and set them as environment variables
for key in $(echo "$SECRETS_JSON" | jq -r 'keys[]'); do
  value=$(echo "$SECRETS_JSON" | jq -r --arg k "$key" '.[$k] // empty')
  echo "$value"
  if [ -n "$value" ]; then
    export "$key=$value"
  fi
done

I can see these values in the log.

But when I try to access these variables from my PySpark script, I cannot read them.

os.environ.get("POSTGRES_URL")  # returns None

for key, value in os.environ.items():
    self.logger.info(f"{key}: {value}")  # my variables are not listed
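A likely explanation, sketched below as an assumption rather than a confirmed EMR detail: a bootstrap script runs in its own shell process, so `export` only affects that process and its children, and the variables are gone before Spark ever starts. The script path and `DEMO_VAR` here are invented for the demo:

```shell
#!/usr/bin/env bash
# Hedged sketch of a likely cause (an assumption, not a verified EMR detail):
# a bootstrap script runs in its own shell, so `export` only affects that
# process and its children; processes started later never see the variables.
printf 'export DEMO_VAR=from_bootstrap\necho "inside: $DEMO_VAR"\n' > /tmp/demo_bootstrap.sh
bash /tmp/demo_bootstrap.sh          # prints: inside: from_bootstrap
echo "after: ${DEMO_VAR:-unset}"     # prints: after: unset
```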

As I am new to EMR and Spark, please help me understand how I can set my environment variables from SSM in EMR.
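One common workaround (an assumption on my part, not an official EMR recipe) is to have the bootstrap script persist the variables to a file that later processes source, such as `/etc/spark/conf/spark-env.sh`. The sketch below writes to a temp file as a stand-in for that path, and the `POSTGRES_URL` value is invented:

```shell
#!/usr/bin/env bash
# Hedged sketch: persist variables to a file that later processes source.
# On EMR, /etc/spark/conf/spark-env.sh is a typical target (assumption; the
# exact path may differ by release). This demo writes to a temp file instead,
# and the POSTGRES_URL value is invented.
set -eu

SPARK_ENV_FILE=$(mktemp)   # stand-in for /etc/spark/conf/spark-env.sh

# In the real bootstrap, this line would come from the Secrets Manager loop.
echo 'export POSTGRES_URL="jdbc:postgresql://db:5432/app"' >> "$SPARK_ENV_FILE"

# A later process that sources the file sees the variable.
bash -c "source '$SPARK_ENV_FILE'; echo \"POSTGRES_URL=\$POSTGRES_URL\""
```

Alternatively, values can be passed per job with `--conf spark.yarn.appMasterEnv.POSTGRES_URL=...` and `--conf spark.executorEnv.POSTGRES_URL=...`, which are standard Spark-on-YARN settings for driver and executor environments.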

vivek
Asked 6 months ago · 339 views
1 Answer
AWS
Answered 6 months ago
